Archive for the ‘Technology’ category

Add Journalists to the List of Things Unreliable

May 15, 2008

Apparently a reference was made to an iPhone-like tablet as the sort of device that could benefit from Intel’s new CPUs, and ZDNet took this as a claim that such a thing existed. Read about it at


Computers can’t even do math!?

May 15, 2008

Since the whole world by now should be aware of the general failure of artificial intelligence, it shouldn’t be very controversial to say that I’ve (mostly) always realized that computers can’t do many things that are very easy for people to do. Forget anything like thinking: the most basic identification of black shapes on a page as alphanumeric characters sometimes fails, and facial recognition seems to be a long way from being anywhere close to reliable.

But reading David Flanagan’s fabulous book “The Ruby Programming Language”, I discovered that computers can’t be trusted to do simple math either. Here’s the direct quote, followed by my simplified explanation.

Binary Floating-Point and Rounding Errors

Most computer hardware and most computer languages (including Ruby) approximate real numbers using a floating-point representation like Ruby’s Float class. For hardware efficiency, most floating-point representations are binary representations, which can exactly represent fractions like 1/2, 1/4, and 1/1024. Unfortunately, the fractions we use most commonly (especially when performing financial calculations) are 1/10, 1/100, 1/1000, and so on. Binary floating-point representations cannot exactly represent numbers as simple as 0.1.

Float objects have plenty of precision and can approximate 0.1 very well, but the fact that this number cannot be represented exactly leads to problems. Consider the following simple Ruby expression:

0.4 - 0.3 == 0.1 # Evaluates to false in most implementations

Because of rounding error, the difference between the approximations of 0.4 and 0.3 is not quite the same as the approximation of 0.1. This problem is not specific to Ruby: C, Java, JavaScript, and all languages that use IEEE-754 floating-point numbers suffer from it as well.

One solution to this problem is to use a decimal representation of real numbers rather than a binary representation. The BigDecimal class from Ruby’s standard library is one such representation. Arithmetic on BigDecimal objects is many times slower than arithmetic on Float values. It is fast enough for typical financial calculations, but not for scientific number crunching. Section 9.3.3 includes a short example of the use of the BigDecimal library.

So, in other words, if you ask what 0.4 - 0.3 is, you get what looks like the correct answer. But if you ask whether 0.4 - 0.3 is equal to 0.1, you're told that it isn't. And please remember: although Ruby is used in the example, this is not a bug in Ruby but the generally accepted way for programming languages to work, standardized by the IEEE as IEEE 754.
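The book's point is easy to check for yourself in plain Ruby (a minimal sketch; BigDecimal ships with Ruby's standard library):

```ruby
require 'bigdecimal'

# Binary floats: the approximations of 0.4 and 0.3 don't subtract
# to exactly the approximation of 0.1
puts((0.4 - 0.3) == 0.1)   # => false
puts(0.4 - 0.3)            # => 0.10000000000000003

# Decimal representation: exact for fractions like 1/10
diff = BigDecimal("0.4") - BigDecimal("0.3")
puts(diff == BigDecimal("0.1"))   # => true
```

The usual advice follows from this: never compare floats for exact equality (test whether the difference is smaller than some tolerance instead), and use a decimal type like BigDecimal for money.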

Add to this the fact that different programming languages disagree about what the remainder is when you divide a negative number by a positive one, and you'll see that you have to be very careful about what you let computers do for you. They can mess up even the one thing that seems perfectly suited to them: basic math.
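Ruby itself exposes both conventions, which makes the disagreement easy to demonstrate: its % operator is a floored modulo (the result takes the sign of the divisor, as in Python), while Numeric#remainder truncates toward zero, matching the % operator of C, Java, and JavaScript:

```ruby
# Floored modulo: the result takes the sign of the divisor
puts(-7 % 3)             # => 2

# Truncated remainder: the result takes the sign of the dividend,
# which is what C, Java, and JavaScript compute for -7 % 3
puts((-7).remainder(3))  # => -1
```

So the "same" expression, -7 % 3, yields 2 in Ruby and -1 in C — another reason to be careful when porting arithmetic between languages.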

An alternative to Ruby on the iPhone

March 7, 2008

…from someone who says he's tried to do just that: Programming Nu.

Here’s the description of the alternative, called “Nu”:

Nu is an interpreted object-oriented language. Its syntax comes from Lisp, but Nu is semantically closer to Ruby than Lisp. Nu is written in Objective-C and is designed to take full advantage of the Objective-C runtime and the many mature class libraries written in Objective-C. Nu code can fully interoperate with code written in Objective-C; messages can be sent to and from objects with no concern for whether those messages are implemented in Objective-C or Nu.

Nu currently requires Mac OS X version 10.5 or greater and runs on PowerPC and Intel systems. A Linux port is in progress; contact me directly or visit my blog for more details.

This is all from the guy who commented on Jason Fried's prediction about the iPhone (where others raised the question of using Ruby for development):

Regarding Ruby and Objective-C:

I spent a lot of time looking at the combination of Ruby and Objective-C. Starting at the beginning of 2005, I built the RubyCocoa Resources site and later wrote my own bridge from scratch, RubyObjC. Eventually I decided that ultimately, the best way to put a scripting layer on Objective-C was to write one that was specifically designed for the job. Last year I did that, and this afternoon I got it working on the iPhone.

If I understand this correctly, he's more or less built a Ruby replacement specifically designed for building apps on the iPhone. I'm not sure what challenges the new language overcomes where Ruby fails (perhaps in mapping to Objective-C concepts?), or why he chose a Lisp-like syntax: it would be much easier for Ruby programmers to adopt if it were both semantically and syntactically like Ruby.

A quick poke around his web site, however, answers these questions.

The Future of the iPhone, according to Jason Fried

March 7, 2008

No discussion of developing with Ruby on the iPhone would be complete without mentioning Jason Fried’s take on the future of development on the iPhone:

What we saw today was the spark. The explosion will continue for twenty years. We will all feel the warmth…. just like there were a lot of players in the portable music space, there were no clear leaders. Until Apple came to town.

The same thing is happening today in the mobile space. Palm, Windows Mobile, Blackberry, Symbian. They’ve been players, but no one has broken out big. No one has managed to grab both the business and consumer markets like Windows did on the desktop. Until Apple came to town. At least that’s my prediction.

RubyCocoa on iPhone

March 7, 2008

I’m currently downloading the iPhone SDK (‘as we speak’–it’s 2.1 GIG), after playing around last night with developing toy apps in RubyCocoa.

The idea that it just might be possible to build iPhone apps with RubyCocoa is irresistible (if slightly implausible). A quick Google search suggests that others are asking the same question, but no one seems to have an answer. Something on Apple’s site, however, suggests that it might be possible:

The applications you create with Ruby and Python are packaged exactly like native Mac OS X applications. Your end-users will not be able to tell the difference. What’s more, Apple is committed to binary compatibility between releases of Mac OS X. This means you will no longer need to embed the runtime and language interpreter in your application.

So they’re saying that (for non-iPhone RubyCocoa development, at least) you don’t *need* to embed the (RubyCocoa) runtime and language interpreter in your application. This implies that it is possible to embed the runtime for RubyCocoa, and if it’s possible to do on a Mac, why not on an iPhone?

Does anyone know any reason that this would or would not be possible?

I coulda’ said “I told you so”…

March 6, 2008

I’ve been thinking for the last few weeks about the lack of Flash on the iPhone. More specifically, I’ve been thinking about the complaints about the lack of Flash on the iPhone. And I’ve been noticing how a single page with a loose Flash cannon can bring my 2.x GIGAhertz DUAL-CORE cpu computer to a grinding slow-down, and wondering if I really want Flash on my iPhone. I’ve been meaning to post that this is, in my opinion, the reason Apple (i.e. Steve Jobs) won’t give Flash the green light. Purely because Flash would destroy the iPhone experience, sucking down every available resource and still not performing acceptably. (And, oh, by the way, 99% of the Flash out there is freakin’ ads!)

Well, I didn’t post about it (until now) but my suspicions were more than confirmed by a recent posting on AppleInsider. Too bad. I could have shown off my brilliance if I’d posted this before them…

The irrationality of slamming tools that make hard things easy

February 20, 2008

Daniel has an interesting post about a talk by photographer Ron deVries, who slams Photoshop as a “toy”. Daniel (though he respects Ron as a photographer) goes on to explain why this sort of thinking is just nutso.

Here’s my take on it:

When people perceive their livelihood to be threatened, they tend to act in irrational (or rational but dishonest) ways. I would guess that a combination of two things led Ron deVries to make these claims:

  1. he’s seen a lot of really cheezy work done in Photoshop, and
  2. he sees people able to gain the skill in hours or days to do digitally what took him weeks or years to learn to do optically and chemically. And they can perform these feats in seconds, while it still takes him minutes or hours.

The first item (if my speculation is correct) is what he uses to justify putting down Photoshop, and the second is what motivates him to put it down. His skills are not nearly so valuable if some upstart can catch up to him.

I would agree that film has a quality to it (I can’t put my finger on it) that I’ve never seen in (my own) digital pictures. And I certainly prefer using my Olympus OM-1 (which forces me to do everything manually) to any digital camera I’ve used. It feels nicer, and I’m almost always happy with the results, although I can’t quantify how or why it is better, or even be certain that it is.¹

This irrationality is not limited to Photoshop-bashing, however. I’ve seen contracting gigs for web development that said they wouldn’t take anyone who used Dreamweaver, as if everyone who uses Dreamweaver is a slave to the WYSIWYG mode and can’t code HTML for themselves. Actually, Dreamweaver can be used as a pretty nice text editor with features particularly suited to web development. I personally use TextMate and Dreamweaver in tandem, and use the WYSIWYG mode mainly to navigate to other parts of my code (especially with legacy table-based layouts, which can be a real pain otherwise).

But if someone is able to get the right results with WYSIWYG mode (that might not be possible in this case–I’m not sure, but for the sake of argument let’s pretend it is), then who cares? It’s the results that matter, and any tool can be used in a multitude of ways. Just because a tool *is* used by uninspired hacks doesn’t make all of its users into uninspired hacks. And just because a tool lends itself to cheezy results doesn’t mean that a user with vision and taste can’t make that tool deliver something worthwhile. Judging a piece of work by the tool used is prejudice, pure and simple. We must let the results speak for themselves.

  1. The one exception is that I’d like to have auto-focus on my OM-1. I’d like to have the camera focus, and then let me adjust it if I want before taking the picture.