flwyd: (escher drawing hands)
Come, let us hasten to a higher plane,
Where dyads tread the fairy fields of Venn,
Their indices bedecked from one to n,
Commingled in an endless Markov chain!
Come, every frustum longs to be a cone,
And every vector dreams of matrices.
Hark to the gentle gradient of the breeze:
It whispers of a more ergodic zone.

In Riemann, Hilbert or in Banach space
Let superscripts and subscripts go their ways.
Our asymptotes no longer out of phase,
We shall encounter, counting, face to face.

I'll grant thee random access to my heart,
Thou'lt tell me all the constants of thy love;
And so we two shall all love's lemmas prove,
And in our bound partition never part.

For what did Cauchy know, or Christoffel,
Or Fourier, or any Boole or Euler,
Wielding their compasses, their pens and rulers,
Of thy supernal sinusoidal spell?

Cancel me not—for what then shall remain?
Abscissas, some mantissas, modules, modes,
A root or two, a torus and a node:
The inverse of my verse, a null domain.

Ellipse of bliss, converge, O lips divine!
The product of our scalars is defined!
Cyberiad draws nigh, and the skew mind
Cuts capers like a happy haversine.

I see the eigenvalue in thine eye,
I hear the tender tensor in thy sigh.
Bernoulli would have been content to die,
Had he but known such a² cos 2φ!
Early in my experience with Unix-like systems I discovered fortune. This program would occasionally provide me with a clever passage attributed to Stanislaw Lem's Cyberiad. "Who is this Stanislaw Lem fellow, and what is a Cyberiad?" I wondered. And then, because it was the mid-90s and search engines didn't exist yet, I did nothing.

A few years later, I started collecting quotes to add to my random signature program. A great many of them came from fortune, since it gave me a quip every time I logged in or out. The first Cyberiad quote that made it onto the list was "The brilliant Cerebron, attacking the problem analytically, discovered three distinct kinds of dragon: the mythical, the chimerical, and the purely hypothetical. They were all, one might say, nonexistent, but each nonexisted in an entirely different way." Different modes of nonexistence: a fantastic puzzle for a philosophy minor like me. I wanted to find and read this book.

There are a few books and authors I keep in the back of my mind for eventual purchase. It gives me direction when I find myself in a bookstore: check the D section of Classics for The Vicomte de Bragelonne, check the A section of Sci-Fi for the HHGTTG radio series scripts, check the L section of Sci-Fi for Stanisław Lem… You would think it wouldn't be too hard to find a book by "the most widely read science fiction writer in the world," yet ten years went by without finding one of his books between Le Guin and Lewis. I was even a beta tester for Google Books on Android tablets, but couldn't buy an electronic Cyberiad. (It's available now, though.) Tantalizingly, Google ran a fantastic narrative doodle based on The Cyberiad. I finally found a copy when I chanced to stop in to Red Letter Books in Boulder, enticed by a book about mangoes on the shelf out front. "Before I buy this, I need to see if they happen to have any Lem." Sure enough, my Quixotic quest found its goal, wedged in a dense shelf of mass market paperbacks.

The Cyberiad is a book of short stories about machines who build machines. The central character is Trurl, a constructor. He and his good friend Klapaucius the constructor build all manner of robots and devices, often on commission from rulers of distant worlds. Unlike the science fiction school led by Asimov, the engineering details of the machines and their scientific mechanism of action are of little importance. The stories are not about the machines but about the philosophical considerations and allegorical implications of such a device in a world not entirely dissimilar from ours. The first story, "How the World Was Saved," concerns a machine that can create anything starting with N. After creating concrete and abstract nouns, they ask the machine to do Nothing, whereupon it starts to eliminate the universe.

Originally written in Polish, the book has a lot of rhymes and wordplay with sciency terms, which works surprisingly well in translation (to English, at least). The poem above was produced by Trurl's Electronic Bard. Lem has a great facility for technical naming in a way that's fun rather than dry: "The second, newer trail was opened up by the Imperium Myrapoclean, whose turboservoslaves carved a tunnel six billion miles in length through the heart of the Great Glossaurontus itself."

What I like best about The Cyberiad is how it resonates with my experience as a constructor of sorts. The book was written in 1967, when hardware was still the king of technology, before we realized that software eats the world. Yet the story Trurl's Machine and other passages describe the foibles of building, debugging, and otherwise producing a computer program better than any software-focused essay I've read. Throughout the book, Trurl displays the three cardinal virtues of the programmer: laziness, impatience, and hubris. If more tales were added to the Cyberiad today, perhaps the constructors would be programs which write other programs.

All makers and builders and coders and creators would do well to read The Cyberiad: Fables for the Cybernetic Age. This hypermedia book report claims the book inspired Will Wright to create SimCity; what might it do for you? Acquire it in cybernetic digital form or via a musty-bookstore-quest for a well-loved copy not so overpriced as these.
flwyd: (dogcow moof!)
I'm working on code for a Burning Man art project. I opted to do it in Ruby because it's got Perl-like text parsing capabilities and it's fun to use. Despite being a little unfamiliar with the idioms, I've been able to write and improve code quickly, without getting bogged down in object factories, dependency injection, long-term reuse, or build systems that pull in five dozen other projects.

Today I started generating graphs (the art part of the project). I'm using gnuplot and my goodness is it fun! The official documentation is a little hard to navigate, but the examples provide a quick path to graphs that, were I to attempt them in a spreadsheet, would result in much fist-shaking.
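To give a flavor of what I mean, here's a minimal sketch of the pattern, not the project's actual code: the file names, columns, and log format are invented for illustration. Ruby does the Perl-ish text parsing, then feeds plotting commands to gnuplot over a pipe.

    # Hypothetical example: tally events per hour from an "HH:MM:SS message"
    # text file, then hand the tallies to gnuplot through a pipe.
    counts = Hash.new(0)
    File.foreach("events.txt") do |line|
      counts[$1.to_i] += 1 if line =~ /^(\d\d):\d\d:\d\d/
    end

    File.open("counts.dat", "w") do |f|
      counts.sort.each { |hour, n| f.puts "#{hour} #{n}" }
    end

    IO.popen("gnuplot", "w") do |gp|
      gp.puts 'set terminal png size 800,600'
      gp.puts 'set output "counts.png"'
      gp.puts 'set xlabel "hour"'
      gp.puts 'set ylabel "events"'
      gp.puts 'plot "counts.dat" using 1:2 with boxes title "events per hour"'
    end

The nice part is that the gnuplot script is just text, so generating a dozen variations of a graph is an each loop rather than a dozen rounds of spreadsheet clicking.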

One sign of quality that both Ruby and gnuplot display is that as you get familiar with them, you can often guess which words to use, even if you've never seen documentation explaining what they do. It's tough to hit the sweet spot between English and formalism, but these two do a pretty good job.
flwyd: (spiral staircase to heaven)
Musing about Apple's mistake of pissing off developers with the App Store, Paul Graham wonders what a smart phone would have to offer to make developers want it instead of an iPhone. He wants to see someone make a software development environment worth using on a handheld device. It would have to be good enough that you'd want to program on your phone, with its small screen and limited keyboard,[1] rather than on a laptop or desktop with huge monitors and ergonomic keyboards.[2]

After using Eclipse for five years, I can't imagine doing significant Java development (particularly debugging) on a handheld device. Mobile device user interfaces are all about presenting only the immediately relevant information, while an IDE[3] is really good at providing a bunch of information at once.

I still use a 24x80 text editor when writing in scripting languages, partly because the problems are usually small enough to fit in just a few files and the languages are high-level enough that a significant amount of code fits in a terminal window. I can imagine writing Python or Ruby on an iPhone, or even Lisp if the editor had good parenthesis matching. Perl and Haskell might be frustratingly symbol-heavy. But just because someone could write Python in a text editor on a handheld device doesn't mean they would. It's just not as enjoyable as using a full keyboard on your desk.

A successful development environment on a mobile device should make use of the unique features a handheld offers. Typing on a phone may be less fun than using a laptop, but what if you could program with gestures? Tilt your phone to the left to generate an assignment. Move it in a circle to create a loop. Touch a variable to reference it again. Use multi-touch zooming and scrolling to select a function from an API.

Historically, attempts at visual programming languages haven't yielded much success at lower abstractions than GUI layout and UML, but I think a graphical language designed from the ground up to fit a handheld environment could be quite elegant. Text is very good at representing symbolic computation, but it's not the only way. Constraints often lead to beautiful art; if the crutch of an easy keyboard weren't available, I'll bet someone would come up with something really clever. A lot of programmers are good at spatial reasoning, so let's try breaking nonlinear programs out of the linear medium of text.

Developing on a handheld would obviously be handy when you're working on an application for that device. Make a change to the tilt response code, then tilt your device to test it. Take a picture[4] and feed it straight to a test case. With lots of sensor input, mobile devices seem like an ideal platform for neural nets and other fuzzy techniques. On the development side, you could present an example to the device and immediately classify it, quickly building a model.

These are all hand-wavy suggestions of possible directions. I've hardly ever used a cell phone, let alone developed a mobile app, so my conceptions may be off base. Yet I think my key message is valid: to make handheld development attractive, it needs to be approached from the ground up. The metadeveloper needs to focus on solving a class of problems[5] and find ways to do so using things small computing devices are really good at. I've focused on sensor input here, but mobile devices have other unique features. Would dictating pseudocode to your phone be useful? Maybe there are ways to take advantage of location information and remote networking in programming. In the end, I think successful handheld development will be as different from desktop IDE development as that is from simple interactive text editors as that is from programming with punchcards.


[1] Not owning a smart phone, I'm assuming that typing on a handheld device is more of a chore than using a full keyboard. I know Apple and others have done clever things with predictive text, automatic spelling correction, and other techniques to make typing easier on a phone. Unmodified, I suspect those would not work well for programming, where function names deliberately deviate from proper spelling and punctuation has very special meanings that differ between languages. You could build language models for programming languages, of course, but you could also implant such models in IDEs (most already do) and use a full keyboard. I think dropping the keyboard metaphor would be preferable for handheld input. Why not leverage the motion and orientation sensors and use arm and wrist motions to "paint" your words in thin air? This seems like it would work particularly well for Chinese. Nonetheless, I don't think it would be faster than using a desktop keyboard.

[2] I actually hate ergonomic keyboards; I'm most comfortable on a laptop or Apple keyboard. The most RSI-inducing computer input I've done is use the scroll wheel on a cheap Microsoft mouse. But the key (hah!) feature of an extended keyboard for programming in rich environments is hot keys, which are hard to accomplish with a keyboard interface like the iPhone's.

[3] For the non-programmers in the crowd, IDE stands for Integrated Development Environment. Here's a screenshot of the information overload I'm talking about.

[4] Why do phones come standard with (crappy) cameras these days? My camera can't make low-quality phone calls; I don't need my phone to take low-quality pictures. The benefits I see to cell phone cameras are (a) you've always got one on you and (b) it's immediately connected to software and the network. This makes it ideal for things like taking a picture of a barcode and comparison shopping instantly. The picture quality only has to be good enough to make out black and white lines.

[5] The best programming languages are the result of somebody thinking "There's got to be a better way" to solve a set of problems he was working on. Perl is really good at working with text because that was the itch bugging Larry Wall. Dennis Ritchie created C to work on Unix, and it's still the most popular language for writing operating systems. Guido adds new features to Python by trying out the syntax on a sample of existing code and deciding which produces the most readable result.
flwyd: (dogcow moof!)
More gems from 1986's Programmers At Work, this one from Butler Lampson:
That’s why I think the idea of computer literacy is such a rotten one. By computer literacy I mean learning to use the current generation of BASIC and word-processing programs. That has nothing to do with reality. It’s true that a lot of jobs now require BASIC programming, but the notion that BASIC is going to be fundamental to your ability to function in the information-processing society of the twenty-first century is complete balderdash. There probably won’t be any BASIC in the twenty-first century.

It's the 21st century now, and the surviving BASIC dialect is Visual Basic, which is more different from mid-80s BASIC than it is alike. The heart of BASIC is to make it easy for people without a strong computer background to write programs. Depending on your perspective, this may be good or bad; BASIC and Visual Basic have been home to some truly groan-worthy code, but they've also let people accomplish many straightforward tasks more effectively. As the number of computer users has grown exponentially in the last few decades, the percentage of people who know a programming language has dropped significantly. In the 1970s, perhaps half of computer users in academic or research environments could write a program, and most businesses that owned a computer had someone who could program it to some degree. Today, we've realized that programming well takes a style of thinking that doesn't come naturally to a lot of people, in addition to an investment of time in understanding the ins and outs of specific systems. We've shown that it's more effective to have experts in programming learn new domains and write programs targeted to those domains than to have experts in domains learn how to program.

Lampson's bigger point is also insightful, but in a way it's wrong. It's true that the details of almost no program used widely in 1986 are relevant today[1]. The specific syntax of Microsoft BASIC, the keystroke shortcuts of WordPerfect for DOS, and the location of hidden items in King's Quest are all irrelevant today. But folks like me who learned how to use computers before we learned how to drive have a cognitive model of computer interaction that's a lot more flexible and successful than folks in my parents' generation who get confused about the web and have no hope for social media. The medium is the message.

[1] Amusingly enough, this isn't as true for programmers. The C programming language, the vi and emacs text editors, and Unix-like operating systems have all evolved significantly in the last 25 years, but if you knew how to accomplish something back in the day, you can still do it now. Not to mention COBOL, the illness in the sleeper zombie army of legacy code.

Protip

Thursday, March 19th, 2009 03:35 pm
flwyd: (java logo)
If you're wondering why you have no data, it's entirely possible you haven't written that part of the code yet.

Ruby Soup

Thursday, November 20th, 2008 12:06 am
flwyd: (java logo)
I had a thought at work the other day. Java has a strict one-parent-class rule. Philosophically, this is a good match in domains like taxonomic biology (a phylum is in one kingdom, a genus is in one family). But it's a terrible match in domains like cooking. A soup inherits flavors from many sources. You can add meat to what was originally a vegetable stew. Ruby is a much better language for programming a curry.
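A toy sketch of the difference, with module and class names invented purely for the metaphor:

    # Java allows exactly one parent class. Ruby gives you one parent class
    # plus any number of mixed-in modules, so a dish can inherit flavor from
    # several sources at once.
    module Savory
      def savory?; true; end
    end

    module Spicy
      def heat; :medium; end
    end

    class VegetableStew
      def ingredients; [:carrot, :potato, :leek]; end
    end

    class Curry < VegetableStew   # the single parent class...
      include Savory              # ...plus as many mixins as the recipe needs
      include Spicy
      def ingredients; super + [:chicken]; end
    end

    Curry.new.ingredients   # => [:carrot, :potato, :leek, :chicken]
    Curry.new.heat          # => :medium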
flwyd: (java logo)
I created a fairly simple programming exercise for job candidates. It's not trivial, but it's not super hard. Someone who's written programs before and can figure out the solution should be able to write the program in two hours or less. The problem description says "Your program should read from standard input (stdin) and write to standard output (stdout). Sample input and output are available."

I'm rather dismayed by the number of submissions which don't read from standard input or write to standard output. The most common violation is using a hard-coded path like C:\input.txt (tip: I'm not running Windows, I don't have a C: drive; even if I did, I wouldn't have the sample input there). Other violations include requiring filenames on the command line (not terrible), prompting for all values interactively in such a way that I can't just run cat input.txt | program, and one submission in PL/SQL that hard-coded the sample input as a bunch of INSERT statements. Tip: I put "use stdin and stdout" in the instructions so that you didn't have to bother with all the file opening and closing details. Also, do they not teach students to run their programs before submitting them? Running a submission on input bundled with the problem shouldn't throw an error before producing any output. Maybe students don't know how to use a command line environment anymore and I/O redirection is a foreign concept.
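For the record, the contract asks for nothing fancier than this skeleton (the problem-specific logic is elided, and solve.rb is a made-up file name):

    # Read every line from standard input, do the work, write the result to
    # standard output. Then `cat input.txt | ruby solve.rb` or
    # `ruby solve.rb < input.txt > output.txt` just works, with no file
    # paths hard-coded anywhere.
    STDIN.each_line do |line|
      value = line.chomp
      # ... the exercise's actual processing would go here ...
      STDOUT.puts value
    end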

Do today's computer science students really not know what standard input and output are? Do they really have assignments that say "Read this file from C:\Homework\Problem1"? My hope was to create an evaluation script that ran several files through submitted programs and reported a correct answer rate. But when correct programs are scarcely more likely than the rest to read from stdin, I can't even write a script capable of getting an answer.

Punks.
flwyd: (daemon tux hexley)
My comment on LiveJournal user tongodeon's insightful post about "elitist" charges:

Now that I think about it, George W. Bush embodies the three cardinal virtues of the programmer: Laziness, impatience, and hubris.

* He was a C student at Yale, made a big deal of how infrequently the Texas legislature met, and spent a lot of presidential time vacationing on his ranch even while Katrina loomed.
* He argued that Florida shouldn't recount every vote, lobbied for a second land war in Asia before finishing his first, invaded Iraq before weapons inspectors could finish their search, and declared the mission accomplished at least five years before the war's end.
* He governs as if he got a majority of the vote (the first time), routinely adds "I won't respect this part of this law" when signing bills, claimed to be in tune with God's will when invading another country (despite the position of the Pope and many other church heads), and cites executive privilege instead of letting his staff participate in congressional inquiries.

I don't think his regular expression skills are very good, though.


Also, Democracy Now's highlights from the DNC and RNC are each ten or so minutes well spent. Listen to the audio or watch the video; the transcript doesn't capture the whole feeling.