Hey, that's my project!
I just wrote another post about perf benchmarking it against Mathematica proper-- https://www.spakhm.com/ts-wolfram-bench. Really surprised by the result: on the workload I tested, Mathematica is only 2x faster than my barely optimized interpreter. A testament to the V8 engine; I didn't quite realize how ridiculously good V8 is until running this benchmark.
that's awesome, I wrote a Mathematica clone too! I think it's one of the most rewarding projects I've done.
The amazing thing is how Mathematica starts working from just a few simple ideas: an evaluator, a backtracking pattern matcher, and a REPL.
Were you able to implement Condition? How advanced is the pattern matcher? I got stuck after doing Blank, BlankSequence, and BlankNullSequence.
https://github.com/anandijain/cas3.rs https://github.com/anandijain/cas8.rs
Very cool! Pattern matcher isn't very advanced, I only spent maybe two days on it. Only supports Blank, Pattern, and PatternTest. (Also doesn't handle Flat, so for example in Mathematica `Times[x_, y_]` will match `Times[a, b, c]`, but it currently doesn't in ts-wolfram.)
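For anyone curious, one way to handle the Flat case is to try every split of the argument list, re-wrapping each side in the head. A minimal TypeScript sketch (hypothetical names and types, not ts-wolfram's actual internals):

```typescript
// Hypothetical sketch: matching Times[x_, y_] against Times[a, b, c]
// when Times is Flat. A Flat head lets a single blank absorb a
// sub-sequence of arguments, re-wrapped in the head.
type Expr = { head: string; args: Expr[] } | { sym: string };

const sym = (s: string): Expr => ({ sym: s });
const app = (head: string, ...args: Expr[]): Expr => ({ head, args });

// Enumerate binary splits of f[a1, ..., an] for a Flat f: each side
// becomes f[...] if it has more than one element, else the bare element.
// A matcher would then try binding x_ and y_ to each pair in turn.
function flatSplits(head: string, args: Expr[]): [Expr, Expr][] {
  const out: [Expr, Expr][] = [];
  const wrap = (xs: Expr[]) => (xs.length === 1 ? xs[0] : app(head, ...xs));
  for (let i = 1; i < args.length; i++) {
    out.push([wrap(args.slice(0, i)), wrap(args.slice(i))]);
  }
  return out;
}
```

So `Times[a, b, c]` yields the candidate bindings `{x -> a, y -> Times[b, c]}` and `{x -> Times[a, b], y -> c}`, and backtracking tries each one.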
I'm new to Mathematica, so didn't know about `Condition` (thanks for mentioning it)
It was a little hard to get going, the parsing stage usually stops me because I never learned a good parser generator well enough to just start writing code. But once I got ts-parsec working, the rest was fairly easy. I think I got `D` to work within like two days. Was also very surprised how much you can do with so little!
Shoot me an email at coffeemug@gmail.com, let's chat more!
> Certainly Mathematica’s term rewrite loop is optimized to death, and I only spent an hour or two making the most basic optimization
I suspect this benchmarks bigint libraries more than term rewriting. One way to test that might be:
bif[1] := 0
bif[2] := 0
bif[n_] := bif[n-2] + bif[n-1]
Timing[Do[bif[15], 1000]]
You can check that neither tool is smart enough to reduce that to bif[n_] := 0 by comparing running times for different large limits. Fib(15) is just 610, so no bigints are involved.
As a quick double-check, fib(n) < 2^(n-1), and 2^14 is 16384.
This is not the Fibonacci sequence: because the first two terms are 0, the entire sequence is 0.
Indeed, the Fibonacci sequence is in the original benchmark, and Fib(15) is not benchmarking big-integer performance. It should have the same characteristics as the always-zero function.
I think GP's point is that bif[n_] == 0 for all n. A 'smart' optimiser would recognise this, so the computation would be (a) constant-time irrespective of the value of n, and (b) effectively instantaneous, because the function call can be rewritten as the constant value 0.
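Right, and the point is easy to check outside Mathematica too. A plain TypeScript stand-in for the bif definitions above: the function is identically zero, but its call tree has exactly the same shape as naive Fibonacci, so a rewriter that isn't "smart" pays the full exponential cost either way.

```typescript
// bif mirrors the Mathematica rules: bif[1] := 0, bif[2] := 0,
// bif[n_] := bif[n-2] + bif[n-1]. Always zero, but naively recursive.
let calls = 0;
function bif(n: number): number {
  calls++;
  if (n === 1 || n === 2) return 0;
  return bif(n - 2) + bif(n - 1);
}

// Ordinary naive Fibonacci for comparison: same recursion shape,
// only the base cases differ.
function fib(n: number): number {
  if (n === 1 || n === 2) return 1;
  return fib(n - 2) + fib(n - 1);
}
```

bif(15) returns 0 while fib(15) returns 610, and both make the same number of recursive calls, so timing bif isolates the rewrite/dispatch cost from bigint arithmetic.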
I'm curious what people use Mathematica for. I know some people who use it academically. Does anyone use it in a non-academic professional context? I'd be interested to hear from anyone who does, what they are using it for.
I have a non-commercial license that I use personally. The appeal of Mathematica is that it has a library built in for just about everything, which lets you quickly prototype and reason about fairly complex cross-domain problems and then easily visualize them.
Mathematica unfortunately has a lot of warts and rough edges clustered in specific fields (low-level cryptography and binary-manipulation-heavy algorithms), which makes working with it a pain in the ass if you touch those fields regularly, but outside of those specific fields it's fantastic.
And that praise goes doubly if you get the System Modeler/Modelica license as well. With those two together you can model a project using analog circuits, digital circuits, software running on hardware/RTOS, pneumatics, hydraulics, CFD, multibody physics, etc., all running in a unified environment. And then of course from a high level you can tune parameters that drive all the parts of your model and test out swapping in different parts or running under different conditions.
You can do all the stuff that Mathematica and System Modeler are good for with other software packages cobbled together, but that takes a lot more time, and the integrations tend to not be nearly as clean. The only other product package on the market that compares would be MathWorks' MATLAB + Simulink. The main difference being that MathWorks' products are more comprehensive in what they cover, but they tend to be less pleasant to work with.
I use it regularly. Mostly for visualization and prototyping ideas. I've tried over the years to replace it with Python/Sage, but the giant library of built-in functions always pulls me back. Plus, I prefer the lisp-ish language over Python. The visualization capabilities are quite nice too - not just data or function plots, but the ability to construct relatively complex graphics programmatically is quite useful. On a recent project where I needed to build some data structures for spatial processing in 3d, I was able to build a nice 3d visualization to help me debug my code by visualizing the calculations I was doing. I also really like the backwards compatibility of notebooks: I have Mathematica notebooks dating back to the 90s, many of which still work fine (modulo some visual ugliness due to changes they made to layout and presentation over the decades). Few other tools have that longevity.
In the company I work for, I was using Mathematica as a general scripting language, for designing algorithms, plotting data in 2D/3D, photo/video editing, and some occasional algebraic manipulations. Sadly the company did not renew the license because they needed to cut costs. They asked me to port all my tooling to other languages and tools: now I have a mixture of Python, matplotlib/gnuplot, Ruby, Maxima, ffmpeg, ImageMagick, and Octave. So far so good, but I preferred to have everything under one GUI/interface, and Mathematica was giving me that.
Mathematica has thousands of functions; this seems to be just an engine for defining and using rules like those in Mathematica.
>"Writing a toy differentiator turns out to be shockingly easy. It’s a near verbatim transcription of differentiation rules from any calculus textbook:
D[_?NumberQ, x_Symbol] = 0;
D[x_, x_Symbol] = 1;
D[Times[expr1_, expr2_], x_Symbol] = D[expr1, x] expr2 + D[expr2, x] expr1;
D[Plus[expr1_, expr2_], x_Symbol] = D[expr1, x] + D[expr2, x];
D[Sin[x_], x_Symbol] = Cos[x];
D[Cos[x_], x_Symbol] = -Sin[x];
D[f_Symbol[expr_], x_Symbol] := (D[f[x], x] /. x -> expr) * D[expr, x];
D[Power[expr_, p_Integer], x_Symbol] := p expr^(p - 1) * D[expr, x]; "
Absolutely brilliant! And simple! And terse! And brilliant!
So tempted to try this myself in another language - just so I can call it 'tungsten'.
Tungsten oxide - for one in Rust.
Wrust? (WO_3, more properly tungstic oxide)
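If anyone does port this: the quoted rules translate almost line for line into any language with tagged unions. A rough TypeScript sketch with my own hypothetical AST (not ts-wolfram's actual one), where each case mirrors one of the rewrite rules above:

```typescript
// Tiny symbolic differentiator: each switch case is one of the
// quoted Mathematica rules (constant, variable, sum, product,
// power, sin/cos, with the chain rule folded in).
type E =
  | { t: "num"; v: number }
  | { t: "var"; name: string }
  | { t: "add"; a: E; b: E }
  | { t: "mul"; a: E; b: E }
  | { t: "pow"; a: E; p: number }
  | { t: "sin"; a: E }
  | { t: "cos"; a: E };

const num = (v: number): E => ({ t: "num", v });
const vr = (name: string): E => ({ t: "var", name });
const add = (a: E, b: E): E => ({ t: "add", a, b });
const mul = (a: E, b: E): E => ({ t: "mul", a, b });

function d(e: E, x: string): E {
  switch (e.t) {
    case "num": return num(0);                        // D[_?NumberQ, x_] = 0
    case "var": return num(e.name === x ? 1 : 0);     // D[x_, x_] = 1
    case "add": return add(d(e.a, x), d(e.b, x));     // sum rule
    case "mul":                                       // product rule
      return add(mul(d(e.a, x), e.b), mul(e.a, d(e.b, x)));
    case "pow":                                       // power rule + chain rule
      return mul(mul(num(e.p), { t: "pow", a: e.a, p: e.p - 1 }), d(e.a, x));
    case "sin": return mul({ t: "cos", a: e.a }, d(e.a, x));
    case "cos": return mul(mul(num(-1), { t: "sin", a: e.a }), d(e.a, x));
  }
}

// Numeric evaluator, handy for spot-checking derivatives at a point.
function evalE(e: E, env: Record<string, number>): number {
  switch (e.t) {
    case "num": return e.v;
    case "var": return env[e.name];
    case "add": return evalE(e.a, env) + evalE(e.b, env);
    case "mul": return evalE(e.a, env) * evalE(e.b, env);
    case "pow": return Math.pow(evalE(e.a, env), e.p);
    case "sin": return Math.sin(evalE(e.a, env));
    case "cos": return Math.cos(evalE(e.a, env));
  }
}
```

No pattern matcher here, of course, which is exactly what makes the Mathematica version so striking: the rules above are data, not a hand-written switch.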
I applaud calling the language Mathematica, despite Stephen Mathematica's attempts to rename it a couple of years ago.
See also: https://mathics.org/
It's a more mature and complete reimplementation of Mathematica, though still miles behind the original.
Mathematica is the application. Wolfram Language is the language. They are both extremely impressive, each in its own way, but they aren't the same.
Unfortunately when Wolfram hired Mathics' lead developer, the project pretty much stopped.
When was that?
https://github.com/Mathics3/mathics-core/graphs/contributors
The project seems to have had a decent level of contributions for the last couple of years.
The name of the program is ts-wolfram.
Probably just one of those cute auto generated names like "interestingalpacagoose"