Computer music at New Mexico Tech

Around this time last year, my friend Eric Sewell said he was going to teach a computer music class at New Mexico Tech. I had been thinking about taking a class, probably a math class like linear algebra, but on a lark I decided this would probably be interesting. I’ve always loved music, but never really learned an instrument, and I thought, maybe if I involve the computer I can get further than if it’s about dexterity and years of tedious practice.
There is one official Toki Pona orthography, which is the one based on the lowercase Latin alphabet. There are two others: sitelen pona, the script, and sitelen sitelen, which looks like Mayan hieroglyphs. For instance, ale li jo e tenpo (everything has a time) renders in sitelen sitelen as a rather large composition of glyphs, while in sitelen pona it is rendered with one glyph per word: ale li jo e tenpo. Radicals in sitelen pona are an interesting idea.
Decreasing sequences

It occurred to me that I could think of two ways to generate a decreasing sequence. The built-in i. will do this:

       i. _7
    6 5 4 3 2 1 0

But I could see a way to do it with self-reference $: or with iterate ^::

       (<:^:*^:a:) 7
    7 6 5 4 3 2 1 0

This essentially says “iterate, gathering results” (^:a:), “while non-zero” (^:*), “decrement” (<:).
We can improve slightly on the Collatz examples of Chapter 10 of Learning J by noting that, while taking a function power of infinity produces the fixed point, taking a function power of boxed infinity (or simply the empty box) also gives us back the intermediate values. Let’s start by bringing back the function itself:

    collatz =. -:`(1 + 3 * ])@.(2&|)

To review briefly, we can treat f`g@.t as false`true@.
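For readers without J, here is a rough Python analogue of the same idea: apply the Collatz step repeatedly, collecting the intermediate values along the way (stopping at 1, since the raw step has no fixed point there). The function names are mine, not from Learning J.

```python
def collatz_step(n):
    # the two branches of the J verb: halve if even, otherwise 3n+1
    return n // 2 if n % 2 == 0 else 3 * n + 1

def collatz_sequence(n):
    """Collect the intermediate values down to 1, the way J's
    boxed-infinity power returns the whole trajectory."""
    seq = [n]
    while n != 1:
        n = collatz_step(n)
        seq.append(n)
    return seq

print(collatz_sequence(6))  # [6, 3, 10, 5, 16, 8, 4, 2, 1]
```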
Let’s talk about Ωe^Ω = 1. Here’s a great blackpenredpen video about calculating this value: This got me thinking about J, because it has some interesting “adverbs” for doing these sorts of calculations. For starters, there is a built-in numerical derivative operator D., and the power adverb, which has a note that giving an infinite power computes the limit. I think of it as a fixed-point operator: it repeatedly feeds the output back in as an input until the input and output are equal, which is a fixed point.
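Since Ωe^Ω = 1 rearranges to Ω = e^(−Ω), the same fixed-point idea can be sketched in a few lines of Python, with a plain loop standing in for J’s infinite power adverb:

```python
import math

def fixed_point(f, x, tol=1e-12):
    """Feed the output back in as the input until they agree."""
    while True:
        y = f(x)
        if abs(y - x) < tol:
            return y
        x = y

# Omega is the fixed point of x -> e^(-x)
omega = fixed_point(lambda x: math.exp(-x), 1.0)
print(omega)  # ~0.5671432904
```

The iteration converges here because the derivative of e^(−x) has magnitude less than 1 near the fixed point.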
The following books have made an impression on me and my beliefs. Biblical Literacy by Rabbi Joseph Telushkin. This book gives you a synopsis of the Bible, and Telushkin draws your attention to the moments and ideas that are important in Judaism. In a similar vein, How to Read the Jewish Bible by Marc Brettler also surveys the content of the Bible and helps to put it into a modern Jewish perspective.
I’ve been investigating radio networking a bit lately for a side project I’m thinking about. This summarizes what I’ve learned. Different radio networking technologies in the hobbyist space have a fundamental tradeoff between range, bandwidth and power usage. Increasing range or bandwidth tends to increase power usage. So one has to ask questions like: how much data do the devices need to exchange, at what rate, how far apart are they, and will they be battery-powered?
How would you chop a linked list in half? A trivial approach would be to just get the length of the list and then walk the list building a first-half copy by tracking the indexes until you get to half that length. Thinking about it, I realized you could use the tortoise-and-hare approach from Floyd’s cycle detector to find the middle of the list: walk the list item-by-item and every-other-item at the same time; when the every-other-item list is exhausted, you’ve found the middle:
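A minimal Python sketch of that idea (the Node class and names are mine, not from the original answer): the slow pointer advances one node per step and the fast pointer two, so when the fast pointer runs off the end, the slow pointer is at the middle.

```python
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def middle(head):
    """Tortoise-and-hare: when the double-stepping pointer is
    exhausted, the single-stepping pointer is at the middle."""
    slow = fast = head
    while fast is not None and fast.next is not None:
        slow = slow.next
        fast = fast.next.next
    return slow

# build 1 -> 2 -> 3 -> 4 -> 5
head = None
for v in [5, 4, 3, 2, 1]:
    head = Node(v, head)
print(middle(head).value)  # 3
```

To actually chop the list in half, you would keep a trailing pointer one step behind `slow` and sever its `next` link.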
Stack Overflow is going through some kind of asshole midlife crisis and this blog post is the corresponding spiritual Mustang GT. Don’t be fatuous, Stack Overflow. Your culture is the way it is because of your rules. It’s not an accident that oh-my-stars we just peeked under the rock and discovered last week. Your rules created this monster. What is the overriding principle of Stack Overflow? It’s that questions (and answers) have differing value.
The architect at my work recently handed me a prototype he built, and I was instructed to maintain it. In so doing, I have found a nice little parable here about how things can go wrong and how it can spiral out of control. Also, let’s spit on Hibernate. At the outset, he chose to build a REST server with Hibernate. He went with RESTEasy, which antedates Jersey (JAX-RS/JSR-339) by a few years.
This is adapted from a comment I left on Hacker News. Why don’t we replace SQL with some other, better query language, perhaps something “equally powerful” like get name, age -> (foos.id X bars.foo_id) |> get first(name) |> head? Part of the point of SQL is that the database may choose to satisfy your query in a very different manner than you might expect given the procedural reading. The example query is a nice illustration; a modern relational database may move project/select steps earlier in the processing as long as it doesn’t change the declarative meaning.
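To make that concrete, here is a sketch using Python’s built-in sqlite3 with hypothetical foos/bars tables matching the example query. The SELECT states only the declarative meaning; EXPLAIN QUERY PLAN reveals the strategy the engine chose on its own, which we never specified.

```python
import sqlite3

# Hypothetical schema matching the example's foos/bars
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE foos (id INTEGER PRIMARY KEY, name TEXT, age INTEGER);
    CREATE TABLE bars (id INTEGER PRIMARY KEY, foo_id INTEGER REFERENCES foos(id));
    INSERT INTO foos VALUES (1, 'alice', 30), (2, 'bob', 40);
    INSERT INTO bars VALUES (10, 1);
""")

# Declarative: we say what rows we want, not how to produce them.
rows = conn.execute("""
    SELECT foos.name, foos.age
    FROM foos JOIN bars ON bars.foo_id = foos.id
""").fetchall()
print(rows)  # [('alice', 30)]

# The planner's chosen join strategy is visible, but not ours to dictate:
plan = conn.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT foos.name FROM foos JOIN bars ON bars.foo_id = foos.id"
).fetchall()
```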
I got one of these things for Christmas from my wife’s family secret santa: The way this thing works is pretty interesting. At the bottom are some coals. You put wood on those coals, and they smoke. At the top is your food. In between is a bowl of water. The water acts as a moderator for temperature and as a humidifier. So I’ve now smoked meat on this thing twice, in the dead of winter.
Reification is a concept that gets realized in programming in a handful of different ways. 1. Reification in RDF In RDF, reification takes claims like “John is a tool” and adds in the contextual information about who is making the claim, turning them into claims like “Dave says John is a tool.” This is useful because RDF documents don’t have to have authority statements within them; if you obtain one from me, you may want the effect of reading it to be something like “Daniel says contents of daniel.
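A toy sketch of the mechanism in plain Python tuples — the rdf:Statement vocabulary is RDF’s standard reification vocabulary, but this is not a real RDF library:

```python
# A bare RDF claim is a (subject, predicate, object) triple; reification
# introduces a statement node with rdf:subject / rdf:predicate / rdf:object
# arcs, so the statement itself becomes something we can talk about.
statement = "stmt1"
triples = [
    (statement, "rdf:type", "rdf:Statement"),
    (statement, "rdf:subject", "John"),
    (statement, "rdf:predicate", "is"),
    (statement, "rdf:object", "a tool"),
    ("Dave", "says", statement),   # the claim about the claim
]

# Query: who asserts which statements?
says = [(s, o) for (s, p, o) in triples if p == "says"]
print(says)  # [('Dave', 'stmt1')]
```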
In the previous article, the discussion of lambda calculus is a bit misleading. We’re not using lambda calculus terms as the intermediate representation. We’re being inspired by the lambda calculus and aping it as a mechanism for propagating information around in the intermediate representation. In Python, using SQLAlchemy, you can construct conditions by using expressions like Users.t.name == 'bob'. The object in Users.t.name is some kind of column type, and it overloads __eq__ in such a way that when you compare it against 'bob', rather than returning a boolean truth value, it returns some kind of filter expression object.
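The trick is ordinary operator overloading. A toy version, with made-up Column and Filter classes rather than SQLAlchemy’s actual ones:

```python
class Column:
    """Toy column type: __eq__ builds a filter-expression object
    instead of returning a boolean."""
    def __init__(self, name):
        self.name = name
    def __eq__(self, value):
        return Filter(self.name, "=", value)

class Filter:
    def __init__(self, column, op, value):
        self.column, self.op, self.value = column, op, value
    def __repr__(self):
        return f"Filter({self.column} {self.op} {self.value!r})"

name = Column("name")
expr = (name == "bob")   # not True/False: a Filter object
print(expr)              # Filter(name = 'bob')
```

The filter object is inert data that a query builder can later compile to SQL, which is exactly the information-propagating role the lambda-calculus analogy was gesturing at.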
Parsing simple sentences in Prolog, you get grammars that look kind of like this:

    sentence --> np, vp.
    np --> pronoun.
    np --> noun.
    vp --> iv.
    vp --> tv.
    pronoun --> [i].
    noun --> [cheese].
    iv --> [sit].
    tv --> [like], np.

Now with this, it’s possible to parse sentences as complex and interesting as “i like cheese” or “i sit”. Wow! Except, just accepting them is not very interesting:
Earlier tonight I was accosted by a stranger for failing to finish the excellent book Software Foundations: I think whoever recommended it to you had a common, bad idea… burnout with nothing to show Hear that Tyler? That time you alighted on a branch with cloudstuff in your beak and illuminated us, tugging on the golden thread separating meaning and method, brought down the lambda calculus, gave us all the Formal Languages curriculum our professor wouldn’t dare and couldn’t dream—apparently you had a bad idea and that’s why all your students “burned out with nothing to show for it.
All other things being equal (they’re not, but let’s pretend), APL will always have a cult of appreciation because it represents really the only time programming has said “we need our own alphabet.” But I don’t think APL symbols are really that evocative. Where does ⍳ come from? It looks kind of like a little i—it’s the index function, after all. i. seems just as legitimate and I can type that.
I realized recently that the way I have been handling answers to Prolog questions on Stack Overflow contravenes the rules. I don’t mind ignoring rules if it serves a higher purpose, but I have also come to realize that it actually is counterproductive. There is a constant flow of pre-beginner questions about Prolog on Stack Overflow. The questions I’m referring to typically sound like this: “my professor said a few incoherent things about Prolog and here’s my assignment and I don’t know where to start.
Every time I’ve spoken with someone about programming in the last month or so, machine learning or artificial intelligence has come up. Let me tell you the reasons why I don’t care a whole lot about it. 1. It’s not going to put me out of a job At least 75% of the actual job of programming is figuring out what a person means when they ask for something. This is not a problem that a machine can solve for you, for the same reason that it’s harder to build a robot that builds other robots than to just build a robot.
Here are some Unix tools everybody should know about. They’re third-party and not especially well-known.

fish

fish - the friendly interactive shell. “Finally, a shell for the 90s” is a great tag line. For a long time we have sort of had to decide if we wanted a better programming experience or a better interactive experience with the shell. Fish has a very nice programming language, but the ooh-ahh factor is definitely the command line.
Who is your hero? Mine is Cook Ting. Cook Ting laid down his knife and [said], “What I care about is the Way, which goes beyond skill. When I first began cutting up oxen, all I could see was the ox itself. After three years I no longer saw the whole ox. And now—now I go at it by spirit and don’t look with my eyes. Perception and understanding have come to a stop and spirit moves where it wants.
If I had to average a list of numbers, I would probably do it like this:

    averagelist(List, Avg) :-
        length(List, N),
        sumlist(List, Sum),
        Avg is Sum / N.

This resembles the actual mathematical definition. Then you could just make a list of numbers and average that. @lurker is right, this is a terrible way to go, but it would work:

    average(N, Avg) :-
        findall(I, between(1, N, I), Is),
        averagelist(Is, Avg).
Natural Semantic Metalanguage is a theory that claims there is a common set of semantics underlying all natural languages. This is a descriptive theory, but we can also use it to evaluate constructed languages and perhaps use it prescriptively to help us create effective constructed languages (or at least let us restrict them consciously rather than accidentally). I’ve taken the chart of NSM semantic primes from 2016 and written the Toki Pona equivalent of each prime in it, and crossed it off a list of Toki Pona words.
I want you to close your eyes for a second and picture your biggest hero. Here’s mine. The man you see pictured here is Buckaroo Banzai. According to the highly informative documentary The Adventures of Buckaroo Banzai Across the 8th Dimension, Buckaroo is both a physicist and neurosurgeon while heading a rock band as well as running the fan club. One gets the sense these are just a few of the salient features of a fairly rich backstory.
Here’s a philosophical question about the “White Elephant” game: is it more likely that everyone will leave with a gift they like or that at least a few people will be miserable? If you don’t know the game, the basic rules are these: every person brings one present to the game. An order is determined for players to take turns. On each turn, you may either open a present or steal a present from someone else.
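One way to attack the question is a crude Monte Carlo simulation. Everything below is an assumption of mine, not part of the official rules: gift values are independent and uniform per player, a player steals the open gift they like best with a fixed probability, a stolen-from victim immediately opens a new present, and “miserable” means holding a gift you value in the bottom quartile.

```python
import random

def play(n_players, steal_prob=0.5, rng=random):
    """One simulated White Elephant game; returns the number of
    players who end up 'miserable' under the assumptions above."""
    # value[p][g]: how much player p likes gift g
    value = [[rng.random() for _ in range(n_players)] for _ in range(n_players)]
    unopened = list(range(n_players))
    holding = {}                       # player -> gift currently held

    def open_present(p):
        holding[p] = unopened.pop()

    for p in range(n_players):
        best_victim = max(holding, key=lambda q: value[p][holding[q]], default=None)
        if best_victim is not None and rng.random() < steal_prob:
            holding[p] = holding[best_victim]   # steal...
            open_present(best_victim)           # ...and the victim opens anew
        else:
            open_present(p)

    return sum(value[p][g] < 0.25 for p, g in holding.items())

random.seed(0)
trials = [play(8) for _ in range(2000)]
print(sum(t >= 1 for t in trials) / len(trials))  # fraction of games with someone miserable
```

With uniform values, each player has roughly a one-in-four chance of disliking a random gift, so in an eight-player game some misery is the common case; stealing only redistributes it.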
About the only social media achievement I am proud of in my life is that I am currently in the top ten of answerers on the Prolog tag of Stack Overflow. I’m especially proud of this because it’s something I’ve achieved by passion alone. I am not an especially talented Prolog programmer. Nor am I a particularly active Stack Overflow user. I don’t use Prolog professionally… in fact I tend to follow a “mama’s cooking” philosophy to it which is probably less than it deserves.
In case you missed it, the Go guys announced an official font today. Fonts and typography are like a weird little undercurrent in programming. Knuth famously took about a decade off from his massive Art of Computer Programming project to invent TeX and METAFONT. He did this because he found the second edition printing much uglier than the first and decided he needed to fix this problem. Knuth spent a decade mastering typesetting and typography.
Ho lee shit. I’ve been burning with anticipation for this album since they released “It’s Just Us Now” a couple months ago. You heard it right? This is an amazing song. At just the moment when it should go nuts, they draw back and create more room for Alexis to sing. Wow! And the video manages to reinforce it somehow. I’ve been listening to this song every other day since it came out, and generally pestering my friends about it.
A response to What’s an Engineer to Do? To fritter away a huge lead in the race. That’s what Apple’s doing. And Apple’s fans, like me, are so upset about it because we see the trajectory. Apple in 2002 was facing an uphill battle, but they were going against idiotic PC manufacturers that didn’t care about build quality or design aesthetics, and OS vendors that didn’t care much about developers or power users.
I’ve been worried about flour more than programming (outside work, anyway). Here’s everything I’ve discovered. Bread This has been my interest lately. The fundamental thing to know is that you’re after a certain ratio of about 5:3 flour to liquid. Beyond that you need some salt (about 1/2 tsp per cup, or more) and some yeast. The amount of yeast is mostly a function of when you want to bake; if you can let it rise for a day, you can use 1/4 or 1/2 tsp.
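Those ratios are easy to put into a tiny calculator. This sketch just scales the numbers above (the 5:3 flour-to-liquid ratio and 1/2 tsp salt per cup); the yeast amount, 1/4 to 1/2 tsp for a day-long rise, is passed through unscaled.

```python
def bread_recipe(flour_cups, yeast_tsp=0.25):
    """Scale the post's ratios: ~5:3 flour to liquid by volume,
    about 1/2 tsp salt per cup of flour."""
    return {
        "flour_cups": flour_cups,
        "liquid_cups": flour_cups * 3 / 5,
        "salt_tsp": flour_cups * 0.5,
        "yeast_tsp": yeast_tsp,
    }

print(bread_recipe(5))  # 5 cups flour -> 3 cups liquid, 2.5 tsp salt
```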
What makes Stack Overflow so successful? I was thinking about this the other day. What came before Stack Overflow? Mostly forums and mailing lists. And in a forum or a mailing list, what you get is a pile-up of data in basically chronological order. This turns out to be a difficult structure for asking and answering questions. So what Stack Overflow really did was this:

1. Constrain the domain of possibilities to questions & answers
2. Elaborate the types of communication that happen in that context, their operators and semantics

You can talk about anything on a forum, or a mailing list.
Thus, programs must be written for people to read, and only incidentally for machines to execute. — SICP I have come to feel that this mindset is mostly hogwash outside academia. The principal utility of programs is their utility. This seems obvious but is overtly contradicted by the cliche above. The market for programs that cannot be executed (or are not primarily to be executed) is precisely the book market.
Why would one bother to learn a made-up language? Whenever I talk about constructed languages this question seems to come up. The most obvious reason is common to all (non-artistic) constructed languages: they have a smaller vocabulary and a more regular grammar, so they are all faster to learn and easier to use than any natural language. This actually helps you twice: first, because you gain a second language much faster than if you study a language far from your native tongue, and again, because going from language #2 to language #3 is much easier than going from language #1 to language #2.
The other day I answered a question on Stack Overflow that had to do with traversing a maze. Somewhat more idiomatically, the code would have looked like this: Before I explain what’s special about this, let’s reflect for a moment on Prolog. Paul Graham once observed: You often hear that programming languages are good because they provide abstraction. I think what we really like is not abstraction per se but brevity.
I’m a big fan of Haskell. Really big. It’s my other favorite language besides Prolog. And there can be no question that a significant aspect of Haskell is its expressive and powerful type system. In a sense, it’s the face that launched a thousand type systems. There was recently a bit of a row about dynamic and static languages (in my opinion this is the winning post) and I won’t rehash it here.
By “augmenting human intellect” we mean increasing the capability of a man to approach a complex problem situation, to gain comprehension to suit his particular needs, and to derive solutions to problems. Increased capability in this respect is taken to mean a mixture of the following: more-rapid comprehension, better comprehension, the possibility of gaining a useful degree of comprehension in a situation that previously was too complex, speedier solutions, better solutions, and the possibility of finding solutions to problems that before seemed insoluble.
Which language is more expressive, Lisp or Haskell? I want to encourage you to forget the word “expressive,” or at least, strive to give it a particular sense in your language advocacy, because it means different things in different contexts, often nuanced and slightly contradictory things. When Lisp is promoted as an expressive language, what is meant is that there is a complete macro system, and it is not difficult to write either abstract code or quite frankly machine-friendly, near-optimal code.
People often ask me which functional language they should study first. This is my “road map” to functional programming languages. Clojure is a very nice, highly pure functional language. It’s an untyped Lisp language, which means it emphasizes syntax macros, but it is just a really nicely designed language, very well-suited to data processing and gluing together Java libraries. Haskell is the flagship typed functional language. Lots of ideas in FP reach their high water mark in this language, which is famous for being difficult to learn and very mind expanding.
If you’re like me, you’re always in search of new ways to fetishize your books. I recently started keeping track of my books at all—a few too many of my expensive computer science tomes have walked away without my really knowing who has them. This has to stop, but solving this problem is boring and easy. You just go install Delicious Monster or BookPedia or something. If we want to make the task harder, and thus, more fun, we have to go further than that.
John Carmack says VR is going to change the world. That ain’t gonna happen. Here’s why:

Zero mainstream non-gaming applications

The PC revolutionized the world because every office with a typewriter or a calculator eventually switched to a PC. Nearly every office function is augmented or replaced by a computerized version. A word processor is manifestly better than a typewriter. The calculator became a spreadsheet. The centerpiece of the meeting is a PowerPoint presentation; several participants are remote.
This is an implementation of Intermediate Challenge #125, “Halt! It’s simulation time!” The crux of the problem comes down to a table of virtual machine instructions like this: I came up with the virtual machine abstraction, a 5-tuple machine with values for code, instruction pointer, instruction count, register values and whether or not the machine is halted. Then I defined some functions like get_register/3, set_register/4, set_instruction_pointer/3 which take a machine and some other arguments (a register number, for instance) and then return a value or a new machine with the appropriate change having been made.
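In Python the same design might be sketched as an immutable record with pure functions that each return a new machine; the field names below are my guesses at the shape behind get_register/3 and friends, not the original code.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Machine:
    """The 5-tuple from above: code, instruction pointer,
    instruction count, register values, halted flag."""
    code: tuple
    ip: int = 0
    count: int = 0
    registers: tuple = (0, 0, 0, 0)
    halted: bool = False

def get_register(m, r):
    return m.registers[r]

def set_register(m, r, value):
    regs = list(m.registers)
    regs[r] = value
    return replace(m, registers=tuple(regs))  # new machine, original untouched

def set_instruction_pointer(m, ip):
    return replace(m, ip=ip)

m0 = Machine(code=())
m1 = set_register(m0, 2, 7)
print(get_register(m1, 2), get_register(m0, 2))  # 7 0
```

Returning a fresh machine from every operation mirrors the Prolog style, where there is no mutation and each step threads the new state forward.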
First, Some Backstory

When I first started getting serious about Unix around the turn of the century I got really hard into Vim. In 2003 I had already had my first glimpse of RSI, so I switched to the Dvorak keyboard layout and bought myself a beautiful Kinesis Ergo keyboard. For some reason I decided this would be a good time to switch to Emacs, so I did. I never really achieved the fluency I had with Vim after switching to Emacs (though I did lose what I had with Vim).
The old ORM chestnut is back and we’re seeing the usual mixture of defenders and aggressors. But why is there such a divide? The argument is fundamentally about choosing what’s in charge of your system, the system being composed of your databases, your applications, and your supporting infrastructure (your scripts, your migrations, etc.) To relational database folk such as myself, the central authority is the database, and our principal interests are what ACID exists to provide: concurrent, isolated, atomic transactions that cannot be lost, on top of a well-defined schema with strong data validity guarantees.
The cat’s out of the bag; the reason why there are fewer women in computer science has finally gotten out. And what a predictable reason: rampant, uncontrolled, pervasive sexism. The only thing surprising to me about this is the sheer number of episodes. Something like four incidents leading to news stories on high-profile sites like Reddit and Hacker News. Pleasantly, there have been a lot of people saying the right things.
James Hague wrote a wonderful, vaguely controversial essay, “Free your technical aesthetic from the 1970s.” I’m refusing to comply, for two reasons:

1. Those who forget the past are doomed to repeat it
2. I hate unnecessary optimism

I think James would agree with the notion that it is too easy to get complacent. Computing, after all, did not emerge from Babbage’s garage with an Aqua look ‘n feel. The road leading to today had a lot of bends, twists, blind drops and dead alleys.
If you took calculus in school, you probably remember it as a blurry mishmash of baffling equations, difficult algebra, many seemingly irrelevant “real-world” examples and overall confusion. The blame for this is usually laid squarely on the academic establishment, who are certainly at fault, but there is more to it than that. What is calculus? The word “calculus” is actually a generic word for a mathematical system. There are lots of other calculi; computer science has the lambda, pi and join calculi.
The very first item in Dan Ingalls’s esteemed Smalltalk introduction “Design Principles Behind Smalltalk” is this: Personal Mastery: If a system is to serve the creative spirit, it must be entirely comprehensible to a single individual. Most of the other points Dan makes in this essay are well-served by modern Smalltalks, and apart from a few glaring outliers, even by other modern systems that are not remotely connected to Smalltalk, but this one item in particular stands out to me as the most criminally absent.
Juxtaposition is an interesting operator. I’m aware of no language in which this operator can be redefined by the user. I’m referring to simple placement of two tokens next to each other, such as “x y.” In the vast majority of languages, this operation has no meaning at all. For example, in Python:

    >>> 2 + 3
    5
    >>> 2 3
      File "<stdin>", line 1
        2 3
          ^
    SyntaxError: invalid syntax

In a few languages, strings juxtaposed are conjoined.
Inspired by James Hague and waiting for a new release of Cuis I thought I might sit down and noodle with J. See if I get any further than I did last time. So, I recently wrote a trivial Markov chain algorithm for Smalltalk. The code is pretty short, so I’ve included it at the bottom of the post. Rub your eyes on it after we talk about the J code.
I used to joke that I was the only programmer under 30 who knows SNOBOL. It may not be the case anymore, but there’s something fun about knowing a bunch of obscure languages. For example, a used Prolog book on Amazon goes for $20. A used Smalltalk book goes for about $10. A used SNOBOL book goes for about $1. For comparison, used books on functional programming usually seem to go for about $80.
In my experience, there are two extreme visions of reality embodied in two philosophies of programming language design. The first is Dijkstra’s view, the view embodied by Milner’s guarantee in ML, that once the program is compiled it cannot “go wrong” at runtime. This perspective treasures correctness and asserts that all wonderful things flow from it: reliability, readability, and so forth. Languages supporting this view of reality are primarily Haskell and ML.
To assist with job interviews at the NRAO we recently wrote a small “contest” program. Without giving away the details, the crux of the problem is to read a file with a list of scans and calculate the amount of time it takes to move the dishes and perform the scans, and report it. Candidates are required to write this in Java, but that restriction does not apply to me, so I of course had to write it in three languages that are not Java: Haskell, Common Lisp and Smalltalk.
I have been using Smalltalk, specifically Pharo Smalltalk, a derivative of Squeak, for some light experimentation with Seaside and proper object-oriented programming. I really do spend too much time being a philosopher of programming and not enough time actually programming. On the plus side, I view this with some derision as a fault. I realized last year or so that all the functional programming in the world wasn’t going to make me adequate at OO design, and I really haven’t advanced at that skill since the middle of college.
Today at work, while dealing with Hibernate, we noticed a queer problem that comes up when using one-to-many mappings. Like everything else in Hibernate, the system is apparently self-consistent, until it isn’t, and I usually find I can get about 90% through an explanation of why Hibernate requires something, but when I get to the last 10% I realize there’s a missing step in my understanding, or an unsound leap of reasoning in the software.