“Gangs of Lesbian Chimps Attack Malevolent Man Monkeys” might be the grocery store tabloid headline for this enlightening article.
In other primate news, a union has hired temp workers to picket in front of a Las Vegas Wal-Mart, in 104 F heat, for $6 an hour, and then refused them union membership.
From the article:
“I asked him (union organizer Hornbrook), I said, ‘How come we’re working here for $6 an hour? I need you to help us find a better job. I want information on the union,’” Rivera said.
He was told, he says, to secure his own job with a grocery store, and then the union would help him to be sure the store paid him appropriate wages.
“This is an informational picket line only,” Hornbrook said. “We’re paying these people. They were out of work before (joining their picket lines). This is an in-between-jobs stop. Picketing isn’t a career.”
Of course this has absolutely nothing to do with the fact that they don’t make enough money to cover union dues, right?
Had an idea for something patently patentable, assuming it hasn’t already been patented and assuming software patents are ethical. BuzzMachine linked to my last business idea about the media business, which effectively made it public domain, but assuming I wasn’t going to actually start the business and that someone else will, it has a nice utilitarian effect.
So the idea is to use Ajax (think Google Maps) to cache data on the fly while navigating through database-driven websites. Here’s how it would work:
If you’re clicking the next-record button to scroll through, say, client information, the server would notice that you’re moving from A to Z (probably through awareness of the query), and while you’re reading about the Samsonite account, Smith’s data would be quietly downloading in the background.
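The prefetch idea above can be sketched in a few lines. This is a toy, not an implementation: `fetchRecord` stands in for the real Ajax (XMLHttpRequest) round-trip, and the `db` map stands in for the server’s database.

```typescript
// Stand-in for the server-side database.
const db: Record<number, string> = {
  1: "Samsonite",
  2: "Smith",
  3: "Sony",
};

// Client-side cache of records already downloaded.
const cache = new Map<number, string>();

// Pretend Ajax call: in a real page this would be an asynchronous
// XMLHttpRequest back to the server.
function fetchRecord(id: number): string {
  return db[id];
}

// Called when the user clicks to a record. While they read it, the
// next record in the sequence is quietly pulled into the cache, so
// the next click is served locally with no round-trip latency.
function viewRecord(id: number): string {
  const record = cache.get(id) ?? fetchRecord(id);
  cache.set(id, record);
  if (db[id + 1] !== undefined && !cache.has(id + 1)) {
    cache.set(id + 1, fetchRecord(id + 1)); // background prefetch
  }
  return record;
}
```

So after viewing the Samsonite record, Smith is already sitting in the client cache.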
It’d have to be smarter than simple awareness of the query, though. It’d have to recognize deeper patterns than just sequential queries. The problem being solved isn’t a lack of web server speed; it’s the latency of the internet. As applications move from the Win32 API and LANs to the web as a platform, new bottlenecks emerge. Latency is that bottleneck for future applications.
There would need to be some standard way to measure client latency and adjust the bandwidth used to match. The tradeoff here is bandwidth: at one extreme, the client downloads the whole database to kill latency, which is of course ridiculously inefficient, not to mention expensive. So that problem would need to be solved, and it has been in the world of the CPU through L1 and L2 caching algorithms.
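The CPU-cache analogy suggests one concrete answer: bound the client cache and evict least-recently-used records, the same trade-off an L1/L2 cache makes. A minimal sketch, with a made-up capacity (a real client would size it against measured latency and bandwidth):

```typescript
// A bounded client-side cache with LRU eviction: keep the hottest
// records locally, evict the rest rather than mirror the database.
class LruCache<K, V> {
  private entries = new Map<K, V>();
  constructor(private capacity: number) {}

  get(key: K): V | undefined {
    const value = this.entries.get(key);
    if (value !== undefined) {
      // Re-insert to mark this key as most recently used
      // (Map iteration follows insertion order).
      this.entries.delete(key);
      this.entries.set(key, value);
    }
    return value;
  }

  set(key: K, value: V): void {
    this.entries.delete(key);
    this.entries.set(key, value);
    if (this.entries.size > this.capacity) {
      // Evict the least recently used entry (the oldest key).
      const oldest = this.entries.keys().next().value;
      if (oldest !== undefined) this.entries.delete(oldest);
    }
  }

  has(key: K): boolean {
    return this.entries.has(key);
  }
}
```

With capacity 2, caching records a and b, touching a, then caching c evicts b: the recently read record survives, the stale one goes.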
The second problem is outdated cache data. If millions of people are using and updating the same data, then cached data is going to go stale. This is similar to the first problem. It could be fixed with a simple server-side check when the user clicks next (which re-introduces the latency problem), or by presenting relatively static data (based on field statistics) first, followed by more recent data.
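The "check when the user clicks next" fix can be made cheap if each record carries a version number the server bumps on every update: the round-trip only carries a version, and the full record is re-downloaded only when the cache is actually stale. A sketch under that assumption (all names here are illustrative):

```typescript
interface ClientRecord {
  id: number;
  version: number; // bumped by the server on every update
  data: string;
}

// Authoritative server-side store.
const server = new Map<number, ClientRecord>([
  [1, { id: 1, version: 1, data: "Samsonite" }],
]);

// Client-side cache.
const clientCache = new Map<number, ClientRecord>();

// Cheap check: only the version crosses the wire, not the record.
function serverVersion(id: number): number {
  return server.get(id)!.version;
}

function readRecord(id: number): ClientRecord {
  const cached = clientCache.get(id);
  if (cached && cached.version === serverVersion(id)) {
    return cached; // cache still fresh, no full download needed
  }
  const fresh = server.get(id)!; // stand-in for the full Ajax fetch
  clientCache.set(id, fresh);
  return fresh;
}
```

This is essentially what HTTP validators (ETag / If-None-Match) do for whole pages, applied per record.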
The last paragraph isn’t an elegant solution; this is better: the server could constantly ping the client with verification of cache reliability, using UDP instead of TCP (the limits of HTTP are starting to become apparent here) to cut latency. You could also have a variable that lets you decide whether you’d like to display data without 100% accuracy: data that would simply change as you looked at the page if more recent data was found in the database.
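That accuracy variable can be sketched too. The UDP transport itself is out of scope for a toy this small; what's shown is just the display policy: render whatever is cached immediately, and let a callback patch the page when the background verification turns up newer data. All names are made up.

```typescript
type OnUpdate = (fresh: string) => void;

interface Versioned {
  version: number;
  data: string;
}

// Authoritative store and local cache, as simple maps.
const store = new Map<string, Versioned>();
const localCache = new Map<string, Versioned>();

// allowStale is the "accuracy" variable: when true, possibly-stale
// cached data is shown at once and corrected later via onUpdate.
function show(key: string, allowStale: boolean, onUpdate: OnUpdate): string {
  const cached = localCache.get(key);
  const authoritative = store.get(key)!; // stand-in for the server check
  if (cached && allowStale) {
    if (cached.version < authoritative.version) {
      // The background check found newer data: patch the page.
      localCache.set(key, authoritative);
      onUpdate(authoritative.data);
    }
    return cached.data; // rendered immediately, no waiting
  }
  // Strict mode: always pay the round-trip for fresh data.
  localCache.set(key, authoritative);
  return authoritative.data;
}
```

So the user sees the old value instantly, and the field changes under them a moment later if it was stale, which is exactly the behavior described above.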