Let's talk about Ωe^Ω = 1. Here's a great blackpenredpen video about calculating this value:
This got me thinking about J, because it has some interesting "adverbs" for doing these sorts of calculations. For starters, there is a built-in numerical derivative operator D., and the power adverb, which has a note that giving an infinite power computes the limit. I think of it as a fixed-point operator: it repeatedly feeds the output back in as an input until the input and output are equal, which is a fixed point. The power adverb turns out to be a hugely useful control construct in several ways—you can use it to describe various kinds of conditions as well as loops.
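To make the fixed-point idea concrete outside of J, here is a minimal Python sketch of what f^:_ does: keep feeding the output back in until successive values agree. The function name and tolerance are my own choices, not anything from J's implementation (J uses its own comparison tolerance).

```python
import math

def fixed_point(f, x, tol=1e-15):
    """Apply f repeatedly until the output (nearly) equals the input,
    in the spirit of J's power adverb f^:_."""
    while True:
        y = f(x)
        if abs(y - x) < tol:
            return y
        x = y

# Iterating cos from any start converges to the Dottie number, ~0.739085
print(fixed_point(math.cos, 1.0))
```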
The first problem in J is figuring out how to express xe^x - 1, the function whose root is Ω. A little help from 13 : produced this: 1 -~ ] * ^. As usual, you read J from right to left. Start with the fork
] * ^.
] is effectively id for functional programmers.
^ is exponentiation; the dyadic form, such as 2^3, calculates 2^3 = 8, and the monadic form ^x computes e^x. The fork trick is that the results of applying the left and right verbs are fed in as the left and right arguments to the center verb, so ] * ^ is exactly xe^x.
1 -~ exploits a little trick about J: constants can occur in certain positions of a fork or hook and act as values instead of functions, and because ~ flips the arguments of a verb, 1 -~ ... at the start of a train is a great way to signify "... - 1" in a formula. So xe^x - 1 = 1 -~ ] * ^.
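In more conventional notation, the train 1 -~ ] * ^ is just the following function. This is a Python transliteration of mine, not something generated from the J:

```python
import math

def f(x):
    """x * e**x - 1, the J train (1 -~ ] * ^)."""
    return x * math.exp(x) - 1

print(f(0.0))   # -1.0, since 0 * e^0 - 1 = -1
print(f(1.0))   # e - 1, about 1.71828
```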
There turns out to be no need to work out the derivative by hand, because we can use the derivative operator D.. In fact, there is an essay about Newton's method on the J software site; I wound up having to change its d. to D. for reasons I didn't understand at the time. In the end, we wind up with this adverb definition:

   N =. 1 : '- u % u D. 1'

Applying it once looks like:
   (1 -~ ] * ^) N ] 1
0.68394
Applying it two more times looks like:

   (1 -~ ] * ^) N ] 0.68394
0.577455
   (1 -~ ] * ^) N ] 0.57745
0.56723
But we can just let the power adverb iterate it for us until we converge:

   (1 -~ ] * ^) N^:_ ] 1
0.567143
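For comparison, here is the same Newton iteration written out in Python, with a central-difference numerical derivative standing in for u D. 1. The helper names, step size h, and convergence tolerance are my choices for illustration:

```python
import math

def f(x):
    return x * math.exp(x) - 1

def newton_step(f, x, h=1e-6):
    # Central-difference numerical derivative, roughly what u D. 1 provides,
    # then one Newton update: x - f(x)/f'(x).
    d = (f(x + h) - f(x - h)) / (2 * h)
    return x - f(x) / d

x, steps = 1.0, 0
while True:
    nxt = newton_step(f, x)
    steps += 1
    if abs(nxt - x) < 1e-12:
        x = nxt
        break
    x = nxt
print(x, steps)   # converges to about 0.567143 in a handful of steps
```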
Not enough precision? We can always ask for more:

   0j50 ": (1 -~ ] * ^) N^:_ ] 1
0.56714329040978384011140178699861280620098114013672
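Worth noting: 0j50 formatting just prints more digits of an ordinary double; it doesn't add precision, and the true Ω = 0.56714329040978387299... diverges from the printed digits after roughly the 16th place. To actually get 50 correct digits in Python, the same Newton iteration can be run under the decimal module. This is an illustration of mine, not what J does internally:

```python
from decimal import Decimal, getcontext

getcontext().prec = 60  # 50 wanted digits plus guard digits

def f(x):
    return x * x.exp() - 1

def fprime(x):
    # d/dx (x e^x - 1) = (x + 1) e^x
    return (x + 1) * x.exp()

x = Decimal(1)
for _ in range(200):
    nxt = x - f(x) / fprime(x)
    converged = abs(nxt - x) < Decimal(10) ** -55
    x = nxt
    if converged:
        break

print(str(x)[:52])  # "0." plus 50 digits after the decimal point
```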
In the end, the whole program is these two lines:

   N =. 1 : '- u % u D. 1'
   0j50 ": (1 -~ ] * ^) N^:_ ] 1
0.56714329040978384011140178699861280620098114013672
Curious how many iterations it took? If you give a boxed value to ^:, it will give you all the intermediates, so we can use # to count them up and <_ to produce an infinity in a box. Careful, though: counting the formatted string counts its characters (six 52-character numbers plus five separating spaces):

   # 0j50 ": (1x -~ ] * ^) N^:(<_) ] 1x
317

Counting the result array itself shows that, at the default six places of precision, it only takes 6 tries:

   # (1x -~ ] * ^) N^:(<_) ] 1x
6
   (1x -~ ] * ^) N^:(<_) ] 1x
1 0.68394 0.577454 0.56723 0.567143 0.567143
There are a few differences between d. and D. in modern J. The version of Newton's method on the wiki page actually does work for this definition of xe^x - 1:

   <:@(] * ^)

There is not a huge difference between this and 1 -~ ] * ^, just that in the original case we are performing an argument-swapped subtraction of 1, while in the new case we are composing in a decrement. The derivative operator d. is able to symbolically differentiate this definition, probably by simply discarding the increment or decrement in the composition, since constant terms vanish under differentiation:

   <:@(] * ^) d. 1
^ + ] * ^
   (] * ^) d. 1
^ + ] * ^

This is just the fact that d/dx(xe^x - 1) = d/dx(xe^x) = xe^x + e^x.
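As a quick numerical sanity check of that identity, a finite-difference approximation (my addition, not part of the original J session) agrees with the symbolic derivative:

```python
import math

def g(x):
    return x * math.exp(x) - 1

def g_exact(x):
    # the symbolic derivative: d/dx (x e^x - 1) = x e^x + e^x
    return x * math.exp(x) + math.exp(x)

h = 1e-6
x = 0.7
approx = (g(x + h) - g(x - h)) / (2 * h)
print(abs(approx - g_exact(x)))  # tiny: the symbolic derivative checks out
```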
In short, the differences between d. and D.:

- d. performs symbolic differentiation
- D. performs numeric approximation of a derivative
- D. has different rank
- D. can be used to produce partial derivatives as well