Monday, April 12, 2010

Linking the Layers of Tech: Respecting the Developers and Whatnot

With Apple fending off Adobe and Twitter buying Tweetie, the community has been in an uproar over how companies treat developers. When you're in the social media and networking business, it's easy to lose sight of the fact that you are still operating on a platform that, just thirty years ago, only a select few could use, let alone program. In other words, the business layer of the company becomes so blinded by its own prospects that it forgets the technology layer. And the technology layer of a social networking company matters, specifically because developers follow their own set of morals and ethics that companies usually ignore. Hopefully, by the end of this post, you or your company will have a better idea of what you're dealing with in the developer community, and which ethical decisions you may butt up against. (And if you're a developer who runs your own company, hopefully the business has not absorbed you so much that you've forgotten these principles.)

Hacking is a culture, not a business. If a company is ever to succeed, it must realize this. Many, and hopefully all, computer programmers actually have fun programming; they do not do it just to make money or become famous. Because of this, hacking has become a culture, subject to the social expectations of its members rather than a victim of corporate profit. Any company that refuses to see this, and thinks the platform underneath the Internet is just profit waiting to be dredged up, should think again. I would seriously consider calling the hacker community a separate nation from the rest of us. And with any culture comes a set of social standards and ethical expectations. This is where the water gets muddy.

Freedom is good. Developers are free spirits; when a hacker wants to solve a problem, they do not restrict themselves to a single platform, language, or framework. They use whatever gets the task done with the most ease and efficiency. Now, I am not advocating that every program and service out there should be open-source; that is a different kind of freedom. But a developer should be allowed to use any and all resources available to solve a problem. Restricting that freedom is like telling a worker to hammer in a nail with a plastic toy hammer: sure, the nail will eventually go in, but it will take hours, and the worker will be fed up with the task after thirty minutes. The other reason you cannot restrict freedom is diversity. Give many developers on many platforms a single task and you will get hundreds of different answers, which gives a much better chance that one of those solutions is the perfect answer. Or would you rather force your developers onto a single platform with a single set of tools, so they all produce the same answer? Sure, Steve Jobs can say that "intermediate layers" produce "sub-standard apps," but without those sub-standard apps, without those failed experiments, how could the good apps ever hope to be made in the first place?
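To make the diversity point concrete, here is a toy sketch (my own illustration, in Python; nothing here comes from Apple or Twitter). The same small task is solved two different ways, and neither is wrong; having both around is exactly how the better one gets discovered:

    from collections import Counter

    def most_common_word_by_hand(text):
        """One answer: count words manually with a plain dict."""
        counts = {}
        for word in text.lower().split():
            counts[word] = counts.get(word, 0) + 1
        # Pick the word whose count is largest.
        return max(counts.items(), key=lambda item: item[1])[0]

    def most_common_word_stdlib(text):
        """Another answer: lean on the standard library's Counter."""
        return Counter(text.lower().split()).most_common(1)[0][0]

    text = "the quick brown fox jumps over the lazy dog the end the"
    assert most_common_word_by_hand(text) == most_common_word_stdlib(text) == "the"

Two developers, two approaches, one working program each. Forcing everyone through a single set of tools would have produced only one of these, and not necessarily the better one.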

Think twice, solve once. A variation on the old "measure twice, cut once" philosophy: a problem should be thought over twice and solved only once. The first part means you should always look over a product and think about exactly what you are creating before setting your developers to work. Programmers should never be given repetitive, boring, uncreative, or stupid tasks. Programming is an art: it feeds off creativity, so do not hand out uncreative work. The second part, solve once, means a problem should never be solved more than once, unless the original solution was truly bad or flawed. If something already exists that does the exact thing you are trying to create, there is no point in building it again. It is far easier and more efficient to build on what already exists and solve a new problem than to reinvent the wheel. The only, and I mean only, reason to solve a problem a second time is to compete against a company that offers the same product, in which case you had better know what you're doing.
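In code, "solve once" often just means checking whether the standard library has already done the work. A minimal Python sketch of the two approaches (again, my own toy example):

    import bisect

    # Reinventing the wheel: hand-rolled insertion into a sorted list.
    def insert_sorted_by_hand(items, value):
        for i, existing in enumerate(items):
            if value < existing:
                items.insert(i, value)
                return
        items.append(value)

    # Solving it once: the standard library already solved this problem,
    # so build on it instead of rewriting it.
    def insert_sorted_reused(items, value):
        bisect.insort(items, value)

    scores = [10, 30, 50]
    insert_sorted_reused(scores, 40)
    assert scores == [10, 30, 40, 50]

The hand-rolled version is not wrong, but it is a second solution to a solved problem; unless you can beat bisect, your time is better spent on a problem nobody has solved yet.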

A company should always keep these points in mind when creating its own services. Just look at Google: Innovation Time Off gives developers the freedom (rule 2) to solve new problems (rule 3) and interact with other hackers (rule 1). Now, back to Apple and Twitter, who have both been accused of violating these ethics. Apple is clearly in the wrong: they wall their developers in, removing freedom, removing culture, and forcing repetition. Twitter, on the other hand, has not violated any of these principles. All they have done is take in a problem that was already solved (Tweetie) and make it their own, with proper compensation, of course. Why third-party developers are complaining is beyond me. Tweetie existed before Twitter bought it, so how does its existing after the purchase have any new effect on third-party services? If you think your mobile Twitter client is so good that you needed to solve an already-solved problem, then it should be good enough to compete with the Twitter brand. Otherwise, work on something new: take a problem that has not been solved before and make it your own.

1 comment:

  1. Impressed with the quotation "Hacking is a culture..." Yes, indeed, hacking is part of cyber-culture. If there had been no hacking in IT history, I bet there wouldn't be much progress in any field of computer programming.