Wednesday, July 11, 2007

read later

http://blogs.msdn.com/ie/archive/2005/12/19/505564.aspx

http://en.wikipedia.org/wiki/Internationalized_domain_name

http://forums.devnetwork.net/viewtopic.php?t=53083
Unicode URI considerations

URLs must be compatible with the DNS system, and are therefore restricted to characters from the ASCII set.

To make Chinese URLs, for example, the thousands of available characters have to be represented with ASCII characters that encode their character codes. In the path and query, this is percent-encoding: each byte of the UTF-8 character becomes a %XX escape, so the % sign appears throughout the content, though probably not at its end. This could of course be used to obfuscate all sorts of things. For domain names the equivalent is Punycode: IE7 and Opera transport the name in ASCII but display it to the client as its local Unicode equivalent. Does that mean we cannot build or edit such data with Firefox at all?
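A rough sketch of the two encodings in PHP (rawurlencode() is core PHP; idn_to_ascii() assumes a build with the intl extension, and the domain is just an illustration):

// Percent-encoding: each byte of the UTF-8 string becomes a %XX escape.
$path = rawurlencode("中文");       // "%E4%B8%AD%E6%96%87"

// Punycode: how a non-ASCII domain label travels over plain-ASCII DNS.
$host = idn_to_ascii("例え.jp");    // "xn--r8jz45g.jp"

echo "http://" . $host . "/news_detail/" . $path;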

URL Rewriting

.htaccess

RewriteEngine On
RewriteRule news_detail/(.*) news_detail.php?parameters=$1


in PHP
$params = array();

function router() {
    global $params;
    // The rewritten URL arrives as alternating key/value segments,
    // e.g. parameters=id/42/lang/en
    $exploded = explode("/", $_GET['parameters']);
    for ($i = 0; $i + 1 < count($exploded); $i += 2) {
        $params[$exploded[$i]] = $exploded[$i + 1];
    }
}
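With the rewrite rule above, a request for news_detail/id/42/lang/en arrives as parameters=id/42/lang/en, so the router leaves $params['id'] = '42' and $params['lang'] = 'en'.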

Tuesday, March 13, 2007

Buffering Bug



A bug that can plainly be observed, and that surely must yield to changing some setting or other, makes one consider other options once two weeks of trying every variation of command structure and detailed analysis of the possible causes has gone nowhere.

If you stick with debugging for too long while the client needs a result, it makes you think harder. The first serve worked fine; there was little difference with the second POST, yet it nearly always broke the XML result. That does not seem at all logical, so the programmer gets it in the neck.

One such event would not debug. After ten days of infuriating and unjust buffer-corruption investigations, I suddenly saw what was happening: the PHP system command was running for too long and returning with a broken buffer. But only on the second load, never the first. We had spent days looking for bugs in the XML, trying every option under the sun. Then it suddenly hit me, and I spent three hours devising a method that would always work; what remained was some front-end browser scripting to keep the user satisfied that things were happening while their data arrived.

In a multitasking environment there are two things to worry about when serving data from the same query. The server may queue the query and take time to supply the data, and it is more likely to do so if it is still channelling out data from the previous query. That is not a logical cause of the problem in itself, but it can excite it: PHP decides the second system call has had long enough and carries on with half an XML delivery. Half an XML document is worth nothing.

It cannot be fixed with the “right settings”.

When you offload a buffer to a file, the file absorbs it in one gulp. When you offload a buffer to a browser, it gets chopped into TCP segments which are delivered far less quickly. As soon as a large delivery has bits of waiting time added to it, the delivery rules change.

The problem is that the process is held up by the slow direct-to-browser delivery, so the OS reduces the IO priority of that process. What you must be able to do with a content stream that is potentially megabytes in one delivery is send it as fast as the line can take it, and you cannot do that after a chopped delivery.

If you send it more slowly, by latching the browser delivery to the file-creation process, then with any significant quantity of data you are telling the operating system that your data is slow and should be given the priority and rules of a slower line. Your next query on that channel is going to contend with the expectations the first query established in the routers and servers along the way.

There are reasons for multitasking technology. One of them is the disparity between server buffers and disks on the one hand and the bottlenecks of serving TCP traffic on the other.

A better way would be to stream the data through multiple UDP connections, but that is a 10k solution. An even better one is to use P2P, but there are a few billion-dollar companies to compete with who are spending tens of millions on such technology for rapid streaming delivery.

The most effective easy, safe and inexpensive way to multitask with large files is to do two things (a sketch follows the list):



a) safely and atomically isolate the capture in the most efficient manner

b) deliver the datastream to the line without any delay states (serving directly to the browser while still processing is extremely vulnerable to buffer breakage).
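A minimal sketch of that pattern in PHP. The command and output path here are hypothetical stand-ins; the point is that the capture is complete and atomically published before a single byte goes to the browser:

$out = '/var/cache/app/result.xml';    // hypothetical published path
$tmp = $out . '.tmp.' . getmypid();    // same directory, so rename() is atomic

// a) capture: the shell redirects the generator's stdout straight into a
//    private temp file, so no browser delivery can stall the system call
exec('/usr/local/bin/make_report > ' . escapeshellarg($tmp) . ' 2>/dev/null',
     $ignored, $status);

if ($status === 0) {
    rename($tmp, $out);                // publish all-or-nothing

    // b) deliver: the file is finished, so it can be pushed to the line
    //    as fast as TCP will take it, with no generator stalls in between
    header('Content-Type: text/xml');
    header('Content-Length: ' . filesize($out));
    readfile($out);
} else {
    @unlink($tmp);
    header('HTTP/1.1 500 Internal Server Error');
}

Because the browser never waits on the generator, the OS never learns to treat the stream as slow; the front-end scripting mentioned above keeps the user informed while the file is being built.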

Tuesday, July 18, 2006

Welcome to Open Sauces.

Here we provide information about Open Source software and discuss it in an informal, chatty manner. We are not interested in blinding you with technical detail; we want to provide the sauce that makes your day taste better!

We changed our minds. Open Source software deserves some technical writing, so that is what this blog will now be about: technical analysis that could save you hours. In-depth, blinding technical discussions and recommendations that we want you to take up in our comments. Disagreements are not useful; elucidation is the key. Open up the box and understand how to corral those electrons, that kind of stuff.