2010 January

Advice: Don't cross-post unless you keep tabs on all your outlets

That title might seem awfully petty. This whole post might be petty. However, there is a particular habit that some of my online friends have picked up that irks me and clashes with my own perception of how social media should work. This nasty habit is syncing your various social media accounts with one another while not paying attention to each of the networks where that content goes.

A simple example of this would be a Facebook update that is automatically posted to Twitter by a user who never checks for replies on Twitter.

The whole idea of social media is to be social. One does not simply shout into the ether and not listen for a response back. The reason that you are on a social network is to keep in touch with friends, family, and colleagues. They may want to respond to whatever it is you said and their message may fall on deaf ears because you simply could not be bothered to check on your Twitter account.

I understand that people are often fickle with social networks. Look at Friendster. That was the social network to be on, and now it has faded into the mists of time. However, if you are going to give people the impression that you are active on a particular social network, then you really should make an effort to talk back to the people who try to communicate with you there.

There is my rant. I realize that this is very petty, but it irritates me to no end when people act like they are listening to me when they cannot even hear me.

How background applications could work on the iPhone

As with any product made by Apple, the iPhone’s interface and overall user experience could be improved in a few areas. Ars Technica has compiled a list of twelve features that they would like to see in iPhone OS 4.0. The final item on that list is multitasking and running multiple applications on the iPhone.

The reasons that Apple has given for not already including this feature on the first three iterations of the iPhone OS are perfectly valid. The battery life of the phone would suffer. There is limited memory. There are limited processor cycles. However, the iPhone seems to be directed towards technologically savvy consumers. One might reason that the majority of iPhone owners would be willing to suffer deteriorated battery life in order to get multitasking.

The usability issue with not having multitasking capabilities is that the user would have to leave an application in order to run another application. An example would be trying to send an email to schedule an appointment. One would have to leave Mail.app, open Calendar.app, get the desired information, and return to Mail.app to finish the email. This process may even have to be repeated several times in the course of writing that email.

Mac OS X has an excellent application switcher in the form of Cmd-Tab. There is no reason that some adaptation of this could not be ported to the iPhone. A shake gesture could be set up to call up the switcher. The double-press of the Home button could also be configured for this.

As for background applications, like an instant messenger, those could be enabled for background execution in much the same way that Apple set up push notifications in iPhone OS 3.0. There is no need for an ugly Windows-style process manager. Just show a simple list of the applications that have background capability and whether or not it is enabled for each. It could be as simple as that.
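To make that idea a bit more concrete, here is a rough sketch, in plain C, of the kind of per-app background list I have in mind. This is only an illustration; none of these names come from Apple's SDK, and the apps and flags are made up.

/* Hypothetical model of a per-app background-capability list, loosely
 * modeled on the push notification preferences in iPhone OS 3.0.
 * None of these names come from Apple's SDK; this is only an illustration. */
#include <stdio.h>

struct app_background_setting {
    const char *app_name;     /* display name of the installed app */
    int supports_background;  /* does the app declare a background capability? */
    int enabled;              /* the user's on/off choice, like a push toggle */
};

int main(void)
{
    struct app_background_setting settings[] = {
        { "Instant Messenger", 1, 1 },
        { "Internet Radio",    1, 0 },
        { "Calculator",        0, 0 },
    };
    int count = (int)(sizeof(settings) / sizeof(settings[0]));

    /* Only apps the user has explicitly switched on keep running in the background. */
    for (int i = 0; i < count; i++) {
        if (settings[i].supports_background && settings[i].enabled)
            printf("%s may run in the background\n", settings[i].app_name);
    }
    return 0;
}

Under a scheme like this, anything the user has not switched on would simply suspend when it leaves the foreground, which keeps the battery and memory costs in the user's own hands.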

Apple has the ability to give the people who want multitasking and background processes an effective way to use them. Hopefully, this feature will make an appearance in iPhone OS 4.0 sometime this year.

Be cool, stay in school (forever)

I love school. I cannot imagine not being in school, though that is a reality I will likely have to face. To me and some other people, there is no better place to be than an educational institution. Why not stay there forever?

Even if you are not a teacher, you still get to shape the educational system in some way. If you are an academic advisor, you get to help students plan their academic, and possibly professional, careers. If you are a software developer or systems administrator within a university, you might be creating or supporting information systems that allow for efficient communication between students and their instructors.

There are dozens of different positions within a major university that might involve never setting foot inside a classroom. However, being there would allow you to continue your own education for years, even decades with relative ease. Go to work, go to class, go home. You could do that for years. I know I would love to have that opportunity.

I do not see myself leaving the education system happily and would love to stay here and work. It is really the best place to be to shape the future of our society and to help ensure its continuity, before the young minds here become too old and stubborn to stray from their stupidity.

Tech companies should leave restrictive countries

Last week, Google broke the news that they had been hacked and that the attack originated from China. Since Google has been careful to not keep any of its servers or data in China, opting to store everything on servers in the United States, it is likely that the Chinese government itself is responsible for the attack. Google responded to this by announcing publicly that they would no longer be filtering content on Google.cn searches in compliance with Chinese laws. This essentially has ended Google’s corporate presence in China for the foreseeable future.

Maybe Google did the right thing here. It is not the only corporation that has submitted to the Chinese authorities in order to have a presence and market share there. As part of the Golden Shield Project, anything on the internet that the Chinese government deems inappropriate or subversive is blocked. Since Facebook and Twitter are exceptionally hard to filter, those sites are blocked entirely.

By giving up its 30% share of the Chinese search market, Google essentially took a financial hit so that it could redeem its moral standing to a degree.

If internet and technology companies want to send a message to the entire world that they will stand up for free speech on the internet, then they should simply leave any country that uses internet censorship to repress its own people.

While China is fairly tech-savvy and its own native search engine, Baidu, would likely pick up the slack, the tech companies that leave would at least keep themselves true to the spirit of the internet: an open, free forum for public discussion and communication.

IU School of Informatics forces laptop adoption

An email sent out by the IU School of Informatics Executive Associate Dean, Anthony Faiola, announced that the School at IUPUI will be removing the computers from IT 355 and IT 257 and making it mandatory for Informatics students to own their own laptops.

On the surface, this seems like a bad idea that would discriminate against students who cannot afford a laptop. If you are an Informatics or Media Arts & Sciences student, however, you will need a computer at some point, probably by your freshman year at the latest. A modern, powerful computer simply is not optional in the fields that School of Informatics graduates are likely to enter after graduation. With laptop computers becoming as powerful as comparable desktops and battery life moving past six hours per charge, it seems a natural choice to buy a computer that can be easily transported between home, work, school, and wherever else a New Media or Informatics undergraduate would go.

As for the cost, this can be covered by financial aid. Again, this is still a worthy investment for any technology-related major to make. The financial aid is simply there to help cover some of the expense, just like with tuition.

The rationale stated on the School of Informatics Laptop Initiative page is that this will allow the school to better maintain specialty, high-end equipment in a few Informatics computer classrooms. Whether there is some other, unannounced reason for the change is only speculation, but since the iMacs being removed were purchased and installed in 2008, it is possible that there is more to it.

It will be interesting to see how this program goes over. Students may also attend one of two open student forums scheduled on January 28th from 5:30-7 PM in the Faculty/Staff Lounge on the 1st floor of the IT building, and on January 29th from 2-3 PM in IT 252.

Intel announces "Arrandale" processor suitable for notebooks

At CES 2010 in Las Vegas this week, Intel announced its Core i5 microprocessor, code-named “Arrandale.” This is the next-generation processor and a step above the current Core 2 Duo that has been featured heavily in consumer computers since 2006. The Arrandale processor is manufactured using Intel’s new 32-nanometer production technology. It also supports hyperthreading, Intel’s implementation of simultaneous multithreading. According to Intel’s website, this allows each core in a hyperthreading-capable processor to execute two threads simultaneously.
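If you want to see the effect of hyperthreading for yourself, a tiny C program can compare the physical and logical core counts the operating system reports. This is just a sketch for Mac OS X, using the standard hw.physicalcpu and hw.logicalcpu sysctl keys.

/* Compare physical cores with the logical cores the OS reports. On a
 * hyperthreading-capable chip the logical count is double the physical count.
 * Mac OS X only; uses the standard hw.physicalcpu / hw.logicalcpu sysctl keys. */
#include <stdio.h>
#include <sys/types.h>
#include <sys/sysctl.h>

static int sysctl_int(const char *name)
{
    int value = 0;
    size_t size = sizeof(value);
    sysctlbyname(name, &value, &size, NULL, 0);
    return value;
}

int main(void)
{
    int physical = sysctl_int("hw.physicalcpu");
    int logical  = sysctl_int("hw.logicalcpu");
    printf("Physical cores: %d, logical cores: %d\n", physical, logical);
    /* A dual-core chip with hyperthreading would report 2 physical, 4 logical. */
    return 0;
}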

The real news is that this new chip will be able to bring the Nehalem architecture to laptops, including the MacBook and MacBook Pro. The next fastest new processor after the Core i5 “Arrandale” is the Core i7 “Lynnfield,” a four-core chip. This means that we could be close to having a quad-core chip inside a notebook, albeit probably an expensive, high-end one.

Some benchmarks leaked a couple of days ago comparing the Core 2 Duo and Core i5 processors, and the performance improvement is impressive. They show an 11% to 30% speed increase over the Core 2 Duo, depending on which benchmark is being considered. According to these results, the overall speed improvement is 19.4% over the current processor.

According to Anandtech, there was no significant change in battery life. Given the significant performance boost and no real increase in power consumption, it looks like the Arrandale processor will be a welcome upgrade for every computer manufacturer in the mid- to high-end laptop market, not just Apple.

It will be interesting to see what impact the Arrandale chip has on mobile computing. While likely it will be only an evolutionary step forward, rather than a revolutionary one, it will certainly be a welcome improvement.

Microsoft should start from scratch with Windows

Microsoft’s Windows is by far the most widely used operating system on the planet. It got there because clone manufacturers were able to undercut Apple and Windows’ other competitors on price in the early nineties and, as a result, took over the corporate sector. People then started buying Windows PCs for their homes because that is what they used at work.

While Windows is the king of the hill, it still has some major flaws that have proven difficult for Microsoft to overcome. First is security. As the most popular OS, it is also the most popular target for people who write malware. These malware developers have had success attacking home systems because the average user has been naïve about security.

The second major issue with Windows is hardware compatibility. There are dozens of different hardware manufacturers. Each uses a different combination of processors, memory, graphics cards, and hard drives. This makes ensuring hardware compatibility virtually impossible for Microsoft.

There are other concerns, but the point is that Microsoft should start fresh with Windows in its next iteration. Since Windows 7 shipped last year, Windows 8 has probably been under development. Windows is the only major operating system left on the market that uses a proprietary kernel and not some flavor of Unix. This should change. Unix has proven itself to be a sturdy, reliable foundation for modern consumer operating systems.

With the advent of 64-bit microprocessors, Microsoft will have an opportunity to make a drastic change with Windows as 32-bit hardware is left to gather dust on the shelf.

How horror movies have become classist

I recently watched “The Hills Have Eyes” and noticed a couple of messages in its sick, gruesome plot. Essentially, it is a story about a middle-class family that is attacked by mutants. It is a cheesy story, one that has been done to death, so to speak, and not that interesting until one starts to consider the secondary messages embedded in the film.

The obvious message, if this film was ever meant to have one beyond senseless violence and cannibalism, is that of environmental neglect, which led to the mutation of otherwise normal people. This leads to my main point: the attackers are lower class and the attackees are middle class. Essentially, it is a small war between two social classes in which almost everyone involved is destroyed.

This is one of the main reasons I do not enjoy horror movies. They do not just leave me unsettled. They guilt-trip me.

Don't be a prick. Don't steal software.

If you are reading this, then you are on a computer or smartphone. Either device needs software to make its hardware useful. Do not ever take for granted the community of developers that creates software for the hardware platforms that you use on a daily basis.

I understand that there is a group of people who jailbreak their iPhones and iPod touches simply so they can put stolen iPhone applications on them. This makes me want to hit each of them in the face with a hammer. If someone charges a price for their software, then it is usually a fair price. Most of the paid iPhone apps are 99 cents. What exactly is the mechanism in your mind that would make you think that you simply cannot part with a dollar after spending at least $200 on the device it will run on?

Software developers should not be taken advantage of. They, like the rest of humanity, need to eat and live indoors. In order to do this, they require a steady flow of income from people who purchase and use their software. If they can no longer afford to do this, they will simply stop developing cool, useful software and everyone will suffer.

Much of the convenience we take for granted in our daily lives is built on the backs of people staring at glowing rectangles, spooling out reams and reams of code. They provide a vital service. If you steal indie-produced software, you are not stealing from some big, faceless corporation. You are most likely stealing from the developer himself.

Don’t be a prick. Don’t steal software.

Dumb terminal redux

Netbooks are currently selling quite well. Essentially, they are small, low-priced, cramped, underpowered laptop computers designed for light internet and word processing duties. They remind me of the dumb terminals of the 60s and 70s. For those too young to remember, a dumb terminal is a computer with only enough processing power to connect remotely to a more powerful computer. Back then, you would probably be connecting to a mainframe.

In addition, cloud computing is becoming more and more popular. While the term is still a buzzword, government agencies are getting set to adopt Google Docs for document storage and collaboration. Clearly, working on a remote server is coming back into vogue, but for different reasons than in the past.

In the past, there was incentive to work on remote computers because they were far more powerful than the more affordable computers of the day. People would get time on a mainframe and carry out complex calculations, often for research purposes. Now, remote computing is used for off-site backup as well as easy sharing and collaboration amongst several people.

With the rise of cloud computing and underpowered netbooks, could we be on the verge of seeing even lower-powered computers come onto the market? Could these computers have the explicit function of connecting to services like Google Apps and Gmail, with the user doing all work on the internet rather than storing data and running software locally? The next few years should tell. There are some serious caveats to putting all your faith in the cloud and keeping your data there. Google Apps and Gmail are great when they work, but sometimes they go down and disrupt life for the millions of users who already rely on them heavily. What happens when you cannot access an important paper or get to your email?

Would you be willing to ditch your regular computer and move to a 21st-century dumb terminal?