Is Everything Miscellaneous? It often feels that way

Do you have a hard time fitting your life into categories? Is it hard to separate work from play, office from home, partying from networking, the obviously relevant from the maybe someday relevant? If so, fear not, apparently you are not alone. For a start, there's me. I'm with you. For several years now it has been getting harder for me to categorize things. At first I thought it was a lack of mental discipline, or laziness, or maybe even the onset of old age.
(Quick, before I forget, an aside about old age and forgetfulness: I recently told my mother that I was concerned because, since I turned fifty, I seem to be forgetting more things. My mother, who is nearly eighty, replied: "Don't be silly, I used to forget loads of things when I was only twenty.")
But this category problem, this blurring of the lines, turns out to be a trend, a sign of the times, as described and discussed in the book Everything is Miscellaneous by David Weinberger, one of the authors of The Cluetrain Manifesto and a Harvard professor with a doctorate in philosophy (but a cheerful way of writing very accessible prose nonetheless). Here's some of the blurb from the book:
Human beings are information omnivores: we are constantly collecting, labeling, and organizing data. But today, the shift from the physical to the digital is mixing, burning, and ripping our lives apart. In the past, everything had its one place--the physical world demanded it--but now everything has its places: multiple categories, multiple shelves. Simply put, everything is suddenly miscellaneous.
And everything includes us. Or at least me. Think about it like this: Try answering the following three questions with a single word:

1. Where are you from?
2. What do you do?
3. Where do you work?

Some people can, but many cannot. My Dad could: Coventry/Engineer/Dunlop. I cannot. As regards question one: I was born and raised in England, but that included a spell in Canada and I have now lived in America for longer than I lived in England. I live in Florida now but also spend quite a bit of time in New York. I lived for more than five years in Scotland (which is different from England) and another five years in San Francisco (which is different from everywhere).

Question two: What I do is information security consulting, and privacy consulting, and film producing, and real estate development, but mainly what I do is write.

Question three: Where I do this stuff is all over. Mainly my office at home but sometimes at a client's office and basically anywhere there is power and bandwidth, which includes planes and trains and automobiles, which are not anywhere but somewhere between two wheres.

Don't get me wrong. I'm not saying my life is cooler than my Dad's. And I'm exaggerating a little to make my case. My Dad led a very interesting life, having served as an engineer in the Royal Navy during and after World War Two. He worked in Canada and 'the States' for several years before settling in at Dunlop in Coventry (but always as an engineer). And he was exploring new options (in engineering) when his life was tragically cut short at 50. However, I think you get my point. And his father could easily have supplied one word answers, as could my maternal grandfather.

But wait a minute, is this 'personal miscellanitude' merely or solely a result of things going digital? What about increased educational opportunities, fewer borders, greater social and physical mobility, cheap air fares? These have all played a part, as have changes in the workplace ethos, like big companies undermining job security and some of them screwing employees out of pensions (my mother still gets a widow's pension check every month from Dunlop but I know a number of people my age who have already lost pensions).

What I think is happening is that forces at play in the physical world are complementing the effect of digitalization. Infinite varieties of order, individualization of world view, these are possible in the digital world and they are reflected in the real world. If this sounds vaguely familiar from philosophy classes, think Hegel and his use of the term 'reflection.' The digital world initially reflects the physical but evolves according to its own internal reason. And the physical world takes on aspects of the digital, at least in our perception of the physical. It is at least worth considering that we are "being digital" when we feel like previously unrelated things in fact go well together or previously related things have no compelling reason to stay that way.

About Brad Pitt and You: Search engine trick barks up wrong tree

Pursuing my obsession with search engines [and myself] led me to enter my name into dogpile, self-described as "all the best search engines piled into one." In other words a so-called meta-search engine that pulls results from other search engines. What I found was quite interesting and applies to everyone, so you might want to try it. Go to dogpile.com and search for your exact name plus any other person, like Brad Pitt, or a place, or a thing. As an example, I put this in the search box:

"Fred Whassaname" gold

The first result from that search is a sponsored one. The second result from that search, or any other search that follows the name/gold pattern, is a page at About.com that is headed "Gold Jewelry - How to Buy Gold Jewelry." The URL of this result is:
holidays.about.com/od/fashion/a/gold_jewelry.htm.

When you go to this page via the above search you will not find any mention of Fred in the text of the page, but if you search the source code of the page you will see an interesting trick at the bottom, an HTML IMG SRC tag that points to a page at the New York Times, a URL with the name embedded in it:

http://up.nytimes.com/?d=1/&g=T&h=76NFF02820kA
012J&hs=76NFF02820kA012J&t=2&r=http%3a%2f%2fww
w%2edogpile%2ecom%2finfo%2edogpl%2fsearch%2fweb%2f
%25252522Fred%25252BWhassaname%25252522%25252Bgold
%2f1%2f%2d%2f1%2f%2d%2f%2d%2f%2d%2f1%2f%2d%2f%2d%2
f%2d%2f1%2f%2d%2f%2d%2f%2d%2f%2d%2f%2d%2f%2d%2f%2d
%2f%2d%2f%2d%2f%2d%2f%2d%2f%2d%2f%2d%2f%2d%2f%2d%2
f%2d%2f%2d%2f%2d%2f%2d%2f%2d%2f%2d%2f%2d%2f%2d%2f%
2d%2f%2d%2f%2d%2f%2d%2f%2d%2f%2d%2f%2d%2f%2d%2f%2d
%2f%2d%2f%2d%2f%2d%2f%2d%2f%2d%2f%2d%2f%2d%2f%2d%2
f%2d%2f417%2ftop%2f%2d%2f%2d%2f%2d%2f1&u=http%
3A%2F%2Fholidays%2eabout%2ecom%2fod%2ffashion%2fa%
2fgold%5fjewelry%2ehtm
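
That wall of characters is just the Dogpile search URL, encoded several times over, with the search terms buried inside it. If you are curious, a few lines of Python will peel the layers back (only a fragment of the tracker is pasted in here, purely for illustration):

# A rough sketch: repeatedly URL-decode the tracker string until it stops
# changing, revealing the search query buried inside it.
from urllib.parse import unquote

fragment = "%25252522Fred%25252BWhassaname%25252522%25252Bgold"

decoded = fragment
while True:
    step = unquote(decoded)
    if step == decoded:
        break
    decoded = step

print(decoded)  # -> "Fred+Whassaname"+gold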


In other words, the New York Times, which owns About.com, makes up pages on the fly, just to meet your search criteria. Making things up is not what one would expect from the New York Times, not after it got rid of those plagiarizing journalists. And one consequence of this nasty little search hack is that you can enter your name together with that of your favorite movie star and get a bunch of hits that appear to link you with that person. But it also means you can get a bunch of hits off:

"Fred Whassaname" felon

This raises the possibility that someone could conclude, if they just go by the number of hits, that poor Fred is a felon. There's no basis for this and somehow it just feels wrong.

Storm Over Missing White House Email: But will anyone be held accountable?

A White House spokesman had stated that only a handful of people were using Republican Party email accounts to conduct government business, but the number has now risen to 88.
Over 50 [of these 88] have no email records at all and there are only 130 emails from Karl Rove during President Bush's first term and none before November 2003 [even though] the Presidential Records Act requires the recording of any communication used in governing [but Bush White House] officials bypassed this by using email accounts set up and run by the Republican Party.
So much for an "open society" any time soon.

In Corporations We Trust? Not!

An interesting piece by Paul Brown in the New York Times today suggests that "maybe senior executives really do not have a clue." He reports that a study in the McKinsey Quarterly, the business journal of McKinsey & Company, found “a trust gap between consumers and global corporations, as well as a lack of understanding among business leaders about what consumers really expect from companies.”

As an example he cites the finding that, while 68 percent of executives said that large corporations made a “generally” or “somewhat” positive contribution to the public good, fewer than half (48 percent) of consumers worldwide agreed. The number was just 40 percent in the United States.

The study also found--no surprise here as far as I'm concerned--that executives were out of touch with people. For example, when asked what three concerns would be most important to them over the next five years, “Executives predicted consumers would put job losses and offshoring first, followed by privacy and data security, and the environment...[whereas]...almost half of the consumers picked environmental issues, followed by pension and other retirement benefits, and health care.”

I wonder how the average annual compensation of the 4,000 global business executives interviewed for the survey compared to that of the 4,000 consumers they failed to understand. And when asked to rank different institutions in terms of being trusted to act in the best interest of society, consumers not surprisingly placed large global corporations below all the other choices, including nongovernmental organizations, small regional companies, the United Nations, labor unions, and the media.

I bet those global corporate executives are crying all the way to the bank [off-shore no doubt, in the corporate Gulfstream probably].

Electric Ferrari? No, but this electric car beat a Ferrari

As a lifelong EV fan I just love watching these two videos:

The electric car beats the Ferrari and the Porsche
The electric car beats the Lamborghini and the NASCAR

Even when you set aside the mega-geek factor and the bragging rights, I believe fast and powerful electric cars and trucks are the way to change the American perception of EVs for the better.

For the record, my first ride in an EV was in 1971, before some readers of this page were born, and it was not a demo or a prototype. It was a commercial vehicle in daily use, a British milk delivery truck to be exact (you may have a hard time finding info specific to these EVs on the web unless you know that the Brit term for them is "milk float"). Being a 'milkman' was a great way to earn money between high school and university and I was in good company (Sean Connery worked as a milkman in Edinburgh, although he drove a horse-drawn cart, not an electric 'float').

In techno-speak and biz-think, the role of the electric milk float meshes perfectly with the traditional characteristics of an electric vehicle. The range was 30 miles, plenty for the inner city delivery route I covered. The speed topped out at 30 mph, the highest speed limit of any of the roads on the route. The float pictured on the right is pretty much the same as the one I drove. It is even in the livery of the Unigate company, the same dairy I worked for, owned by food giant Unilever. The image is from the amazing milkfloats.org.uk web site. Amazing because yes, there is a whole web site devoted to these vehicles.

The awesome torque of electric motors was perfectly suited to getting a loaded truck off the mark and up to speed in a hurry. The crates back then were metal. The milk bottles were glass, and a full load of 750 Imperial pints weighed, well, it weighed a whole...a big...well a heck of a lot (if anyone happens to know how much, I'd love to hear from them). The point about the weight is, heavy loads are easy for an electric motor to handle (as most EV fans know, electric motors drive locomotives and cruise ships). Furthermore, the weight declined during the seven to eight hours that I spent dropping off full milk bottles and picking up empties, even as the batteries were being discharged. Back at the depot I would plug it in to recharge overnight and it would be ready to go the next morning.
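
For what it's worth, here is a rough back-of-envelope guess; the bottle weight is pure assumption on my part, and the metal crates are ignored altogether:

# A rough, hedged estimate of a full milk-float load (milk plus glass only).
IMPERIAL_PINT_ML = 568      # one imperial pint in millilitres
MILK_DENSITY = 1.03         # grams per millilitre; milk is slightly denser than water
BOTTLE_WEIGHT_G = 400       # assumed weight of an empty glass pint bottle

pints = 750
milk_kg = pints * IMPERIAL_PINT_ML * MILK_DENSITY / 1000
glass_kg = pints * BOTTLE_WEIGHT_G / 1000

print(f"milk:  {milk_kg:.0f} kg")             # roughly 440 kg
print(f"glass: {glass_kg:.0f} kg")            # roughly 300 kg
print(f"total: {milk_kg + glass_kg:.0f} kg")  # well over half a tonne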

Remember folks, those EVs have been working like that, efficiently and pollution-free, since the 1960s. This was not a reaction to the oil crisis of the 1970s. What do you bet that more than 80 percent of all U.S. Postal Service delivery vehicles fit the 30/30 operational parameters of that old milk wagon? We could have had four decades of great gas-saving and emission-reduction from the postal service rather than a sweetheart deal for a petroleum-based government contractor (Grumman seems to make most of the postal vans I see in Florida--and I think the USPS ordered them in 1986).

China Sentences Former Drug Regulator to Death: Accountability indeed

China’s former top drug regulator has been sentenced to death for taking bribes to approve untested medicines. This was announced as the country’s main quality control agency started its first recall system targeting unsafe food products.

One can't help but wonder if there is something America can learn here about accountability. We see Bush appointees departing office, after making dismal and disastrous decisions, loaded with medals on their chests and cheered by pats on their backs. In contrast, the Chinese are executing a public official for taking bribes. A meaner-spirited person than I might be tempted to wonder just how many people would be left alive in the White House if we applied the same rule here.

Money Handling Lags Behind Technology [in big chunks]

Finally, on 5/22, I got the payment through for cobb.com, sold 8 days earlier. As I suspected, the buyer has positioned the domain as a parking site. Whether or not the new owner will now try to sell it to a "Cobb" business, I don't know. I contacted as many of those as I could ahead of the auction and they were obviously outbid by the new owner.

The time it took to complete the transaction was surprisingly long. What we have right now in the transaction field is a strange mix of models and technologies. Some transactions seem fast. For example, deposits to, and debit purchases from, my Bank of XXXX account seem almost immediate, although the 'posting cycle' may not always match what you see when monitoring online. PayPal seems to happen fast and top-ups from my bank account are pretty quick.

But try moving a lot of money and things slow down. When you go from moving hundreds of dollars to shifting thousands, your choices start shrinking. At the same time, confidence in the system and trust in the customer seem to decline (as anyone can attest who has heard the dreaded words come through the drive-thru speaker: "There'll be a hold on these funds").

As with all things commerce-related, it's all about trust and so far there is little evidence that the new forms of trust enabled by technology have outpaced the new forms of trust-abuse, a.k.a. fraud, that technology has engendered.

Anyway, cobb.com has gone, long live cobbsblog.com!

The Intuitive Interface Myth: The fault of gurus and experts

Okay, so I am officially fed up with the notion that graphical user interfaces are "intuitive" and "easy to use." There is nothing inherently intuitive or easy in a GUI. It all comes down to the design. Moving a mouse pointer over an icon and clicking it may look cool, may feel cool, but how easy is it for the average person? The answer depends on a variety of factors, like hand-eye coordination and icon design. Half the time my screen has a bunch of icons on it whose meaning is less than obvious. In other words, I have to learn what the icon means; I cannot simply intuit the meaning. Surely a word would be better? Yes, I know that you can turn on words for some icons, but this is inconsistent between applications and operating systems. And when you get to the web all bets are off. Some sites underline links, others don't. Some use rollovers, others don't. The same function is given different names on different sites, and so on and so forth.

How did we arrive at this situation, where computers and software are designed with interfaces that are non-obvious? Obstacles rather than enablers? There are several parties to blame. Let's start with the industry giants and the wars between them, which did not help (a great case study for MBA students on how the free market influences interface design: does the iPod dominate MP3 players because of its interface? Did the windows wars between Apple and Microsoft help or hinder interface evolution?).

Competition is great for some things, but when companies get fixated on one-upping the competition (in order to sell more product) there is a tendency to force software and hardware developers to add bells and whistles and do things differently, even when an unadorned standard config is working fine. There is a whole book in this phenomenon, but consider one example, an interface issue that may well be the single greatest cause of lost productivity in the late nineties and early oughties (or whatever this current decade is called).

I'm talking about the way File Save works. Back in the old days, somewhere between the Pterodactyls and the 386 chip, it was "standard" for the File Save command to require confirmation, much the same way that the File Save As command does today. Suppose you had opened up the spreadsheet of weekly sales figures and updated them. When you selected File Save the spreadsheet application would ask you to confirm the overwrite: Yes or No? The reason for this was obvious: You might want both versions of the spreadsheet, the one that you opened and the edited one. The latter might be very different. For example, the original might be the Megabank proposal which you had edited to become the Ultrabank proposal. You might have deleted a lot of information from the original on the way to the new version.

Obviously the File Save As command is for just such situations, but if there was one instruction that was drummed into the brains of early adopters of PC technology, back in the days when they were prone to disk crashes and brownouts and OS flakiness, it was this: Save now and save often. At that time, saving was not a destructive process. But it became one. And the Apple Mac was where it started. The Mac introduced "File Save with no overwrite confirmation." This meant you could have a problem if you opened a 10-page report, spent an hour re-writing the last 5 pages, hit File Save, then changed your mind about the changes. Even worse, open the document, perform Select All, Cut, File Save, and think about what happens if the machine hiccups before you Paste.

In all these scenarios there were workarounds that prevented them from being problematic, but they required a significant change in work flow. And for what? To make it easier to save work, a goal not necessarily accomplished without some hard lessons and tough data losses in the interim. Arguably things got worse when Microsoft Windows apps aped this style of File Save. (I well recall long distance arguments as a beta-tester with Borland as it struggled to choose the file save style for Quattro Pro--go with the new Excel/Mac "overwrite" style or stick with the traditional "confirm overwrite" style of Lotus 1-2-3.)

Windows aspired to be like Mac only different. That led to several File Save issues. One of the benefits of a graphical OS is the ability to convey more information in the same space. For example, an application could show if File Save was necessary by graying out and disabling the File Save command when the version of the document in memory was the same as that stored on the hard drive. But that feature has never been implemented consistently. That's a pity because it is really handy to know if changes have been made. Consider the task of editing a large image where the File Save command can take a long time to execute; performing unnecessary file saves in this situation is a real waste of time. The Canvas graphics program is one application that conforms to the "gray=saved" convention.

The current "saved" status of a document is particularly important when you are dealing with files that exist in two places, such as a web site you are editing locally before uploading. Fittingly, Dreamweaver MX is another app that uses the "gray=saved" convention.

I like the "gray=saved" convention but like a lot of interface conventions one cannot rely on it being there across apps or platforms. Why is this a problem? Because better and more consistent interfaces improve productivity and safety. We're all familiar with steering wheels. They allow us to jump behind the wheel of any car and navigate through traffic with a high level of expectation of success. They are a convention that car makers mess with at their peril, however much they want to "out-innovate" or "one-up" the competition. And we don't teach our kids to drive cars by telling them clockwise for starboard, anti-clockwise for port, because those are not the conventions used in driving cars. Port and starboard are for boats, where steering is sometimes a matter of push the tiller right to go left and so on. But in the early days of automobiles, some used tillers. Most people agree the wheel thing was a step forward and it has been the automotive interface standard for navigation for nearly a century. Maybe computers could use a similar period of interface standardization and stability.

Carter Calls Bush Worst in History: What's wrong with that?

So, former President Carter was quoted by the Arkansas Democrat-Gazette as saying the Bush administration "has been the worst in history." Fair enough. It is an opinion that I happen to share. But the Bush White House, still determined to undermine the cornerstone of the open society, namely criticism, expressed outrage and called Carter "increasingly irrelevant." Yeah right. The opinion of a former President and Nobel Prize winner is irrelevant. Sadly, Carter felt he had to back-pedal as reported in this story.

Personally, I don't hold with this whole "past presidents don't criticize sitting presidents" thing. After all, Reagan saw fit to criticize Clinton and took several "cheap" shots (remember "I may not be a Rhodes scholar but..."). Consider this:

Eisenhower was critical of John F. Kennedy's domestic policies, the first President Bush pounded on Bill Clinton, now his pal, for his Haiti policy, and Nixon chided the first President Bush (for comparing himself to Harry Truman in his 1992 re-election campaign). Theodore Roosevelt was brutal in his assaults on Taft and Woodrow Wilson. (Media Matters)