Worse is better

Post by Ibeam200 » Sat, 10 May 2008 13:57:43


I came across this bit of programming philosophy recently and, to my
surprise, see that I have been using the "Worse is better" philosophy
for as long as I can remember. Although the article does not explicitly
mention APL, the principles hold just the same. How does this fit
with some of the rapid prototyping methodologies?

The original article is at www.jwz.org/doc/worse-is-better.html

-------------------------------------------------------------------------------------------------------------------------------------


The Rise of ``Worse is Better''
By Richard Gabriel

I and just about every designer of Common Lisp and CLOS have had
extreme exposure to the MIT/Stanford style of design. The essence of
this style can be captured by the phrase ``the right thing.'' To such
a designer it is important to get all of the following characteristics
right:

* Simplicity -- the design must be simple, both in implementation and
interface. It is more important for the interface to be simple than
the implementation.

* Correctness -- the design must be correct in all observable
aspects. Incorrectness is simply not allowed.

* Consistency -- the design must not be inconsistent. A design is
allowed to be slightly less simple and less complete to avoid
inconsistency. Consistency is as important as correctness.

* Completeness -- the design must cover as many important situations
as is practical. All reasonably expected cases must be covered.
Simplicity is not allowed to overly reduce completeness.

I believe most people would agree that these are good characteristics.
I will call the use of this philosophy of design the ``MIT approach.''
Common Lisp (with CLOS) and Scheme represent the MIT approach to
design and implementation.

The worse-is-better philosophy is only slightly different:

* Simplicity -- the design must be simple, both in implementation and
interface. It is more important for the implementation to be simple
than the interface. Simplicity is the most important consideration in
a design.

* Correctness -- the design must be correct in all observable
aspects. It is slightly better to be simple than correct.

* Consistency -- the design must not be overly inconsistent.
Consistency can be sacrificed for simplicity in some cases, but it is
better to drop those parts of the design that deal with less common
circumstances than to introduce either implementational complexity or
inconsistency.

* Completeness -- the design must cover as many important situations
as is practical. All reasonably expected cases should be covered.
Completeness can be sacrificed in favor of any other quality. In fact,
completeness must be sacrificed whenever implementation simplicity is
jeopardized. Consistency can be sacrificed to achieve completeness if
simplicity is retained; especially worthless is consistency of
interface.

Early Unix and C are examples of the use of this school of design, and
I will call the use of this design strategy the ``New Jersey
approach.'' I have intentionally caricatured the worse-is-better
philosophy to convince you that it is obviously a bad philosophy and
that the New Jersey approach is a bad approach.

However, I believe that worse-is-better, even in its strawman form,
has better survival characteristics than the-right-thing, and that the
New Jersey approach when used for software is a better approach than
the MIT approach.

Let me start out by retelling ...
 
 
 

Worse is better

Post by phil chast » Sat, 10 May 2008 16:58:52


A nice article -- thank you for posting it.

I always liked the "good enough is perfect" maxim promulgated by the
early Unicists.

Since successive approximation is the basis of almost all numerical
routines, it has always surprised me that adopting the same approach
to system development should be greeted with near-hysteria in some
quarters.
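
A minimal sketch, in Python, of that successive-approximation idea
(the tolerance here is arbitrary): Newton's method for a square root,
stopping as soon as the answer is "good enough" rather than exact.

    # Successive approximation with a "good enough" stopping rule:
    # refine a guess for sqrt(a) until it is within a chosen tolerance.
    def newton_sqrt(a, tol=1e-10):
        x = a if a > 1 else 1.0        # any positive starting guess will do
        while abs(x * x - a) > tol:    # stop when the error is small enough
            x = 0.5 * (x + a / x)      # refine the guess
        return x

    print(newton_sqrt(2.0))            # 1.41421356..., good enough for sqrt(2)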

I'm not sure the principle is universally applicable, though -- you
wouldn't want to build a bridge that way (or, leastways, you wouldn't
want to use a bridge that you knew had been built that way, would you?)

/phil

 
 
 

Worse is better

Post by crisho » Thu, 15 May 2008 16:02:08

A good approach, in the general "Agile Methodologies" mode. I'm
feeling lazy, so a few quotations (let someone else do the thinking):

* Simplicity -- the design must be simple, both in implementation and
interface.
# "Everything should be made as simple as possible, but not
simpler." -- Albert Einstein

* Correctness -- the design must be correct in all observable aspects.
Incorrectness is simply not allowed.
# "Have no fear of perfection -- you'll never reach it." --
Salvador Dali

* Consistency -- the design must not be inconsistent.
# "A little inaccuracy sometimes saves tons of explanation." --
Saki

* Completeness -- the design must cover as many important situations
as is practical.
# I was tempted by the extreme programmers' YAGNI (You Ain't
Gonna Need It), but this has been around a bit longer -- "Sufficient
unto the day is the evil thereof." -- Matthew chapter 6, verse 34

In summary: "If a thing is worth doing, it is worth doing badly." -- G.
K. Chesterton

re-engage brain...

Chris
 
 
 

Worse is better

Post by Ibeam200 » Mon, 19 May 2008 05:58:28

> I'm not sure the principle is universally applicable, though -- you

True enough. When it comes to programming, I find that this bit of
logic from the other post (Holding a Program in One's Head) goes a
long way...


The "atoms" at the bottom need to be as "perfect" as possible. With
sensible program normalisation practices (something like: it's okay to
duplicate code fragments in the name of simplicity, but don't
duplicate where something is stored), programs should remain simple
enough.
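
A minimal sketch of that normalisation rule, in Python (the names are
hypothetical): the unit prices are stored in exactly one place, while
the two report functions happily duplicate a small formatting fragment
in the name of simplicity.

    # One source of truth for the stored data ...
    UNIT_PRICES = {"widget": 2.50, "gadget": 7.00}

    # ... but the small formatting fragment is duplicated, and that's fine.
    def invoice_line(item, qty):
        total = qty * UNIT_PRICES[item]               # read the single stored copy
        return f"{item:10} x{qty:3}  {total:8.2f}"

    def quote_line(item, qty):
        total = qty * UNIT_PRICES[item]               # same lookup, same storage
        return f"{item:10} x{qty:3}  {total:8.2f} (quoted)"

    print(invoice_line("widget", 4))
    print(quote_line("gadget", 2))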
 
 
 

Worse is better

Post by Gosi » Mon, 19 May 2008 08:09:24


The data, and how it is stored and presented, is what matters most.
You have to think about how the data will best describe what you want
to accomplish. A good data structure makes all programming easier. In
APL, databases work best when you can apply simple operations to big
pieces of data at once. Most other languages operate on one byte at a
time, and then it is easy to forget about the correct structure of the
data. It is often better to encode the data, work with the codes and
indexes during operations, and only use the real data for
presentation. In my experience, APL works best with integers. ADI,
which is (or at least was) the most successful APL application I know
of, worked that way.
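
A minimal sketch of that "work with codes, decode only for
presentation" idea, in Python with NumPy (the data is made up):

    import numpy as np

    # The real names are stored once, in a small code table; the bulk
    # data holds only small integers.
    cities  = np.array(["Reykjavik", "London", "Boston"])
    city_ix = np.array([0, 2, 2, 1, 0, 2, 1, 0])      # integer code per record
    sales   = np.array([12, 7, 3, 9, 4, 8, 5, 6])

    # Operate on whole arrays of codes at once ...
    per_city = np.bincount(city_ix, weights=sales)

    # ... and decode to the real data only when presenting the result.
    for name, total in zip(cities, per_city):
        print(f"{name:10} {total:6.0f}")
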
It is a very exciting development to be able to use files as
variables, even make them look like matrices, with no need to read
them all into a workspace. Creating good directory and folder
structures and setting up the files correctly can save a lot of
programming.
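
The "files as variables" idea can be sketched the same way with a
memory-mapped array (a hypothetical file name, in Python/NumPy rather
than an APL file system):

    import numpy as np

    # A large binary file treated as a matrix, without reading it all
    # into memory.
    rows, cols = 1_000_000, 4
    data = np.memmap("measurements.dat", dtype="float64",
                     mode="w+", shape=(rows, cols))
    data[0] = [1.0, 2.0, 3.0, 4.0]     # assign as if it were an in-memory array
    print(data[:3])                    # only the touched pages are loaded
    data.flush()                       # write the changes back to the file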