Thursday, December 18, 2008
What will this do to the Net Neutrality arguments? Real-time communications can be buffered for only so long; a web page can be delayed 500ms without too much complaint, and an email can be delayed thousands of milliseconds without notice.
Thursday, December 11, 2008
But look at some of the most important, most influential, most profitable sites and you see terrible design and an irreverent approach to pixel-perfect design: IMDB is an industry standard, but it's ugly as hell. Craigslist is ugly; and how about Google, eBay, and Amazon? The beauty of those sites is in their functionality.
Sunday, November 30, 2008
For further reading:
Can Labor Implement Clean Feed
Filtering Pilot and ACMA Blacklist
Great Firewall of Australia
Sunday, November 23, 2008
During the second quarter of 2008, a typical U.S. mobile subscriber placed or received 204 phone calls each month. In comparison, the average mobile customer sent or received 357 text messages per month — a 450% increase over the number of text messages circulated monthly during the same period in 2006.
In U.S., SMS Text Messaging Tops Mobile Phone Calling
At this point text has not supplanted voice, since one can't make a one-to-one comparison between text messages and voice phone calls. Still, one needn't be a statistical genius to see that if growth continues at the present rate, the data transmitted by text will soon vastly surpass voice.
The Nielsen data shows that phone calls have leveled off at approximately 200/month with texts up to 350/month. Soon text/email msgs will be the primary means of communication and voice messages will become a vestige of the past. Yeah!! No more "call me msgs."
It must be pointed out that the Nielsen data pertains only to cell phone users, not to landlines. Is there still a use for landline phones? Maybe for some; perhaps rotary phones are still useful for emergencies. But for day-to-day use, landlines are going the way of answering machines, VCRs, and cassette decks.
What does this mean for you - the business customer? Stop putting out alpha-mnemonics based on the rotary/push-button keypad. Soon no one is going to be able to use them - let alone remember that a 2 is ABC and a 3 is DEF, etc.
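For what it's worth, the letter-to-digit translation is purely mechanical. Here is a minimal sketch (the function name is my own invention; the mapping follows the standard ITU E.161 telephone keypad layout):

```python
# Standard ITU E.161 keypad: letters grouped under their digit
KEYPAD = {
    'abc': '2', 'def': '3', 'ghi': '4', 'jkl': '5',
    'mno': '6', 'pqrs': '7', 'tuv': '8', 'wxyz': '9',
}
LETTER_TO_DIGIT = {ch: digit for letters, digit in KEYPAD.items() for ch in letters}

def mnemonic_to_number(mnemonic):
    """Convert an alpha-mnemonic like 1-800-FLOWERS to its all-digit form."""
    return ''.join(LETTER_TO_DIGIT.get(ch, ch) for ch in mnemonic.lower())

print(mnemonic_to_number("1-800-FLOWERS"))  # 1-800-3569377
```

The irony, of course, is that the business still has to publish this translation somewhere once keypads with letters disappear.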
Friday, October 24, 2008
Wednesday, October 8, 2008
Thursday, September 25, 2008
In general I agree with this rule of thumb. However, I disagree completely when it comes to external links from a blog. First and foremost, blog readers are already a self-selected user group. It's true that there are always new blog readers who are unaware of the rules of blogging etiquette, but I think few habitual blog readers are newbie computer/internet users. Now, the above phrase is a HORRIBLE one for an IA person to throw out without some corroborating data. I must say I haven't any data myself; if anyone can point me to some, I would be very happy.
My hypothesis, still to be backed up by empirical data, is that the user experience is superior when links to external sites open in a new window. The reasoning is that this allows the user to experience and explore the new site without losing contact with the original source material. If the external source is not interesting, it is simple to close the window. If the new site is interesting, one can easily click through 10, 20, or more times and would then have a more difficult task returning to the original page than simply closing the new window.
Tuesday, September 23, 2008
The first priority of all site owners and IAs is to make the content available to as many people as possible. If people want to use Lynx, well, let them. It is a basic violation of web conventions to tell the user what browser to use, as bad as accessibility and usability violations. I remember when IE first came out and Microsoft was thought of as a monopolistic entity about to consume the world, and many sites put up "Netscape Only" splash screens. It was a bad policy then and it is a bad policy today.
Additionally, IE6 still constitutes about 20% of the market. What sense does it make to block 20% of your users from accessing your site? IE6 will be a force until corporate and government agencies upgrade to a new platform. Only when large sites that cater to corporate clients, like CNN, stop testing against IE6 can I see smaller sites following suit.
The only way I can see dropping support for IE6 making any sense is if your web app requires scripts not available in old browsers. However this is a business decision that should not be made lightly.
A better idea, should you want to drop IE6 support, is to develop to web standards, then check IE6 for any major breakdowns that prevent people from using your site. After the major issues have been cleared, do not allocate any extra development time to minor items. At this point it would make sense to flag IE6 users, put a small banner at the top of the page saying that your site no longer fully supports IE6, and ask people to “Upgrade to IE7, as this site may not function perfectly with older versions of IE.”
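Flagging IE6 users server-side is a near-trivial check on the User-Agent header. A rough sketch (the helper name is my own; it relies on the fact that IE6 reports "MSIE 6." in its User-Agent string, while Opera sometimes masquerades as IE):

```python
def is_ie6(user_agent):
    """Rough check for IE6 based on the User-Agent request header."""
    # IE6 reports "MSIE 6."; exclude Opera, which can masquerade as IE
    return "MSIE 6." in user_agent and "Opera" not in user_agent

# Show the upgrade banner only to flagged users
if is_ie6("Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1)"):
    print("This site no longer fully supports IE6.")
```

The same test can of course be done client-side with conditional comments; the point is only that the flagging step costs almost nothing.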
Saturday, September 20, 2008
One of the experiments shown on the program indicated that you will feel more positive about a newly met person if you're holding a warm beverage; conversely, if you're holding a cold beverage you will feel more negative about that person. (Might the same apply to a website?) This resonated with me, as I just went through a hiring process and have been very dissatisfied with the person selected. If we had been drinking Red Bull instead of coffee, would we have come to a different decision?
Thursday, August 28, 2008
One should not be surprised that systems maintained by ISPs to provide traceability to ISP accounts become less precise once one stops using them for ISP purposes and starts trying to trace back to actual people.
The author stresses that not only would the traceability evidence have to be accurate enough to stand up in court, but the prosecution would also have to account for numerous blocking and diversionary scenarios put in place by the real criminals and not necessarily by the individual being charged with the crime.
If you're at all interested in the subject of internet privacy this paper is a worthwhile read.
Wednesday, August 27, 2008
Is it officially time to freak out in Apple’s general direction?
Not only does Apple remove applications from the App Store—I Am Rich was taken down yesterday, for example—but it’s now emerged that the company can remotely disable applications from individual iPhones.
Obviously, Apple should be able to protect itself and its users from malware - but should it have gotten rid of "I Am Rich"? From a public relations perspective the answer is "hell yes," and from a general business perspective the answer is "of course." Any business, from a small mom-and-pop to Walmart, can determine what is and is not sold in its store. It may make a mistake in allowing a product in; surely it has the right to correct that mistake.
There may be times when a company bows to public pressure but is that necessarily a bad thing?
Tuesday, July 29, 2008
In time, will purchased applications be removed or edited without your permission? Will newspapers, books, and video be edited without your knowledge? The keeper of this knowledge (the company from which you download the information – and the government which has oversight) can alter this data at whim. Without real care we could create a 1984-style dystopian society.
I am very positive about the future of technology but we must take care to make certain that we keep our privacy, that the concept of people being “secure in their persons, houses, papers, and effects, against unreasonable searches and seizures” applies to on-line data as implied in the 4th Amendment. Computer companies (Microsoft, Apple) are now having their devices call “home” to detect piracy and malware. While I completely approve of this we must keep in mind potential problems that may come in the not too distant future.
Tuesday, July 1, 2008
Here’s the interesting aspect of this conversation – the client is an intelligent, educated man, reasonably technologically adept who has been using computers since the 1980s and has been on the internet for over 10 years. And yet – he thinks this way?
Here’s what Jakob Nielsen has to say:
Opening up new browser windows is like a vacuum cleaner sales person who starts a visit by emptying an ash tray on the customer's carpet. Don't pollute my screen with any more windows, thanks (particularly since current operating systems have miserable window management).
Designers open new browser windows on the theory that it keeps users on their site. But even disregarding the user-hostile message implied in taking over the user's machine, the strategy is self-defeating since it disables the Back button which is the normal way users return to previous sites. Users often don't notice that a new window has opened, especially if they are using a small monitor where the windows are maximized to fill up the screen. So a user who tries to return to the origin will be confused by a grayed out Back button.
Links that don't behave as expected undermine users' understanding of their own system. A link should be a simple hypertext reference that replaces the current page with new content. Users hate unwarranted pop-up windows. When they want the destination to appear in a new page, they can use their browser's "open in new window" command — assuming, of course, that the link is not a piece of code that interferes with the browser’s standard behavior.
The only exception to this would be when linking to .pdf or .xls files or some other sort of non-HTML file.
Designers and site owners should not think it is their business to decide when users should get a new window. It's the user's computer and the user's decision to make. Furthermore, the whole point of hyperlinking is the seamless linking from one point to the next; opening another window breaks the connection between sites.
Another reason, less important only because opening new windows is already a bad idea but increasingly important over the coming few years, is the rise of cellphones with small screens. Mobile devices don't always support multiple windows, and even when they do, keeping track of the open windows makes for a quite limited user experience.
Thursday, June 12, 2008
First, I felt that the link in the email should contain the customer code; that way the customer wouldn’t have to go back to his email and enter the code. There was virtually no extra coding work, so we could ease the customer experience at no cost to the client. The argument raised against this was the same one raised by many retail operators: retailers count on customers not mailing in the rebate coupons, or forgetting to mention the coupon to the cashier. The customers come to the store for the sale and end up purchasing the item at full price.
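To show how cheap the first suggestion is: carrying the code in the link is a couple of lines in whatever generates the email. A hypothetical sketch (the URL, parameter name, and helper are invented for illustration):

```python
from urllib.parse import urlencode

def coupon_link(base_url, customer_code):
    """Build an email link that carries the customer code as a query parameter."""
    return base_url + "?" + urlencode({"code": customer_code})

print(coupon_link("http://shop.example.com/sale", "SAVE20"))
# http://shop.example.com/sale?code=SAVE20
```

The landing page then reads the parameter and pre-fills the code field, so the customer never has to flip back to the email.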
My second point was that customers seeing the coupon code, and the missed sale, would be alienated. It would be better for the coupon code to be invisible to new customers and instead display notices pointing out that if they signed up they could participate in future sales.
Unfortunately my points were overruled and a new clunky customer code system was developed.
The value in the tags comes, if at all, from the higher clickthrough on the search pages.
Content Management Systems cannot write useful META and TITLE tags without input from the user. Google and the other search engines start to ignore META and TITLE descriptions when too many pages display the same information. Your CMS can be modified to help create these descriptions based upon directory information and other data given to the system. But even the most well-thought-out system requires someone to tweak the final result.
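A sketch of what such a CMS hook might look like, drafting the tags from directory data (the function names are my own, and an editor should still review the output before publication):

```python
def build_title(page_title, section, site_name):
    """Draft a unique TITLE tag from CMS directory information."""
    parts = [p for p in (page_title, section, site_name) if p]
    return " | ".join(parts)

def build_description(summary, max_len=155):
    """Trim a page summary to a search-snippet-friendly length, on a word boundary."""
    if len(summary) <= max_len:
        return summary
    return summary[:max_len].rsplit(" ", 1)[0] + "..."

print(build_title("Dynamic URLs", "SEO", "Example Blog"))
# Dynamic URLs | SEO | Example Blog
```

This gets every page a distinct TITLE and description automatically; the human tweak comes in rewriting the handful that matter most.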
Wednesday, June 4, 2008
This willingness to add more functionality, especially third-party apps, gives me confidence that Facebook will not go the way of Friendster. Looking at the popularity of Facebook's travel apps; their willingness to accept third-party apps; noticing that Google's PicasaWeb is superior to anything Facebook has done so far; knowing that Google is becoming an app superstore; it's not unthinkable that the day will come when Facebook and Google combine forces. Or one swallows the other.
Thursday, May 29, 2008
Spend a little time on Twitter's site; open up an account; and get a feel for what exists.
You can search for a word or phrase at Summize.com and Terraminds.com.
You can look for people at Whoshouldifollow.com and Twitdir.com.
You can also download programs and follow Twitter from your desktop. Take a look at:
Wednesday, May 28, 2008
Blogs are also part of on-line social networks but differ from the other social media sites in two ways. First, one person may have several blogs, each focusing upon a different topic, whereas it rarely makes sense to have multiple Facebook or LinkedIn accounts. The second is the concept of time. In social media sites such as Facebook and Twitter, it is the NOW that matters. Businesses may care to keep a record for legal purposes, but most users rarely care about what was communicated 24 hours ago (if not 24 minutes ago). With blogs, the continuity of posts matters; they are not focused upon the here and now.
All strategies used in attracting clients and consumers must focus first and foremost upon these two aspects, especially the concept of time.
Sunday, May 18, 2008
Search engines have problems indexing URLs that contain session IDs. I would only pass session IDs in the URL in areas of the site which are not to be indexed -- such as password protected areas or shopping cart pages.
Session IDs cause problems for the search-engine bots. The session variables are different each time the bot lands on the page, giving the impression that the page has a new URL every time it is visited. This appearance of duplicate content causes numerous problems; simply put, session IDs must not be visible to search engines.
A second problem with dynamic URLs comes from parameter ordering. The coders must be careful to order the parameters the same way each time, or the search engines will have to juggle multiple "URLs" that lead to the same content.
All in all, dynamic URLs are fine as long as no session variables appear on indexed pages and the coders are consistent with their parameter ordering.
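To make the point concrete, here is a sketch of the canonicalization you effectively want your rewrite layer (or crawler) to perform: strip session parameters and sort the rest, so one page always maps to one URL. The parameter names in SESSION_PARAMS are assumptions; adjust them to your platform:

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# Assumed session-parameter names; adjust per platform (e.g. PHP, ASP, JSP)
SESSION_PARAMS = {"sessionid", "sid", "phpsessid", "jsessionid"}

def canonicalize(url):
    """Strip session IDs and sort query parameters into a stable order."""
    scheme, netloc, path, query, _ = urlsplit(url)
    params = [(k, v) for k, v in parse_qsl(query)
              if k.lower() not in SESSION_PARAMS]
    params.sort()
    return urlunsplit((scheme, netloc, path, urlencode(params), ""))

print(canonicalize("http://example.com/item?b=2&a=1&SID=abc123"))
# http://example.com/item?a=1&b=2
```

Two visits with different session IDs, or with parameters in a different order, now reduce to the same canonical URL, which is exactly the duplicate-content fix the bots need.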
Saturday, May 17, 2008
Monday, May 12, 2008
According to an analysis by Websense, the malware tries to exploit a total of eight security holes to pass malicious code to the visitors of the pages unnoticed, for example via the VML hole already patched in January 2007. F-Secure has monitored attackers who tried to break into .asp and .aspx web pages by submitting the page parameters in an encoded SQL query:
DECLARE @T varchar(255),@C varchar(255) DECLARE Table_Cursor CURSOR FOR select a.name,b.name from sysobjects a,syscolumns b where a.id=b.id and a.xtype='u' and (b.xtype=99 or b.xtype=35 or b[...]
Both code snippets are only the beginning of the request. Administrators of servers delivering .asp or .aspx pages (like Microsoft's IIS) are advised to check their log files for similar entries and if necessary search their databases for injected links.
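Checking the log files can be partly automated. A rough sketch that greps for the tell-tale DECLARE/CAST signatures of these encoded injection attempts (the regex is my own assumption; tune it against your actual logs):

```python
import re

# Tell-tale fragments of the encoded SQL injection probes described above
INJECTION_RE = re.compile(
    r"DECLARE(%20|\+|\s)+@\w+|CAST\(0x[0-9A-Fa-f]+",
    re.IGNORECASE,
)

def suspicious_lines(log_lines):
    """Return log entries that look like SQL injection probes."""
    return [line for line in log_lines if INJECTION_RE.search(line)]
```

Run it over the IIS logs and then search the database for injected links wherever a hit shows up.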
Ah, the ingenuity of mankind. It is impressive isn't it?
Tuesday, April 22, 2008
I learned HTML that summer, and for me, my life changed.
Friday, April 18, 2008
Is it possible to come up with another method aside from opening a new window? Yes: we could pass a parameter identifying the "original" page and add it to the navigation. But I struggle to see how that would be better; even expert users would get lost in an ever-shifting navigational schema. Following the "open book" metaphor, opening a new window, sized smaller than the standard window so it stands apart, is a clear and simple means of providing distinct data to the user.
And yet, a year later, I still see intelligent, highly educated, but unsophisticated computer users getting lost with tabs and failing to use them to organize their browsing. As of now, Jakob Nielsen's warning regarding the use of new browser windows stands.
Friday, March 14, 2008
It's funny how often I've said, or heard, the phrase "80/20" and only now realize that the term "Pareto Principle" was coined and popularized by Joseph M. Juran. Pareto, I've just learned, was a classical liberal economist of the 19th century who observed that twenty percent of the population owned eighty percent of the property, and that this truism held across time and culture.
The 80/20 rule may or may not be another "Golden Ratio" myth but it has certainly entered the lexicon.
Monday, March 10, 2008
"Consumers express a lot of concern about their privacy online in surveys. At the same time, very few engage in privacy-protecting activities," says Leslie Harris, executive director of the privacy advocacy group Center for Democracy and Technology. "There's a real inconsistency."
So, do users value the convenience of having a search bar beside their e-mail more than protecting their sensitive information? "There's a growing market pressure to do right by privacy," insists Harris. "But do I think the market alone will fix this? No. At the end of the day, we still need legal protection." Part of the solution, she says, may be a national privacy law.
The Privacy Paradox
I agree - part of the solution may be privacy laws. Facebook ought to have no more say over your data than does the journal in which you jot down notes.
Thursday, February 28, 2008
You come across substantial news or blog articles that you want to read, but don’t have time at the moment.
You need something to read while sitting on a bus, waiting in a line, or bored in front of a computer.
This is perfect for someone like me. My bookmarks are completely out of hand; I often scrape articles I want to read and when I most want to read them I am away from my computer.
My only fear is quickly bumping up against storage limits (both in space and time span). Here's hoping this site takes off.
Tuesday, February 26, 2008
Several years ago there were two META tags whose express purpose was to help you describe your site.
<meta name="description" content="Place Description Here" />
<meta name="keywords" content="Place Keywords Here" />
Developers and SEOs (Search Engine Optimizers) quickly started to take advantage of the tags. One of the more egregious abuses was putting highly ranked search terms in the tags even though those terms had nothing whatsoever to do with the site in question.
The logic was that people would find and see the site even though it had nothing to do with what they were searching for. This may or may not have been effective for the sites abusing the system, but it annoyed the people using the search engines. It also annoyed the decision makers at Google and the other search engines. The result was that these two META tags became deprecated: search engines no longer valued the information in them.
Once again: anybody who says there are META tags that will help your site rank highly is, at best, many years out of date.
Is there anything you can do to get your site ranked higher? Yes, there are many things that can be done but none of them include META tags.