This language seems quite similar to Scallop [1], which was recently posted to HN [2]. Both are extensions of Datalog to arbitrary semirings, meaning they generalise beyond assigning merely true or false to relational statements, allowing probabilistic reasoning and more (tagging statements with nearly arbitrary values). Scallop is further focused on being differentiable and on letting Scallop code be integrated into a PyTorch function. Both seem to have had quite a bit of work put into them and have JIT compilers (Scallop also has a GPU implementation). I like the sound of "I have further modernized Dyna to support functional programming with lambda closures and embedded domain-specific languages." [3]
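To make the semiring idea concrete, here's a tiny toy sketch of my own in Python (not Dyna or Scallop syntax; the Semiring/solve names are made up): the same two Datalog rules for path/reachability are evaluated once under the Boolean semiring (plain Datalog) and once under the Viterbi/max-product semiring (most-probable path), just by swapping the plus/times operations.

    # Toy semiring-weighted Datalog evaluator (not Dyna/Scallop code).
    # Rules:  path(X,Y) <- edge(X,Y).
    #         path(X,Z) <- path(X,Y), edge(Y,Z).
    # "plus" combines alternative derivations, "times" combines conjoined facts.

    class Semiring:
        def __init__(self, zero, plus, times):
            self.zero, self.plus, self.times = zero, plus, times

    BOOLEAN = Semiring(False, lambda a, b: a or b, lambda a, b: a and b)  # plain Datalog
    VITERBI = Semiring(0.0, max, lambda a, b: a * b)                      # best-path probability

    def solve(edge_facts, sr):
        """Naive bottom-up fixpoint over an arbitrary semiring."""
        path = dict(edge_facts)                  # path(X,Y) <- edge(X,Y)
        changed = True
        while changed:
            changed = False
            for (x, y), w_xy in list(path.items()):
                for (y2, z), w_yz in edge_facts.items():
                    if y2 != y:
                        continue
                    old = path.get((x, z), sr.zero)
                    new = sr.plus(old, sr.times(w_xy, w_yz))  # path(X,Z) <- path(X,Y), edge(Y,Z)
                    if new != old:
                        path[(x, z)] = new
                        changed = True
        return path

    edges = {("a", "b"): 0.9, ("b", "c"): 0.5, ("a", "c"): 0.3, ("c", "d"): 0.8}
    print(solve({e: True for e in edges}, BOOLEAN))  # which pairs are reachable
    print(solve(edges, VITERBI))                     # best path probability per pair

Same rules, different semiring, different analysis; that's the generalisation both systems are built on.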
Same lineage (weighted/semiring logic programming for ML), but different systems. Francis-Landau's work (Dyna) is a term rewriting implementation of a weighted logic language with bag relational semantics, dynamic programming, and a tracing JIT. Scallop is a Datalog-style neurosymbolic language built on provenance semirings, with differentiable/relaxed semantics intended for end-to-end training with neural networks. Think of Scallop as a branch of the lineage optimized for differentiable neurosymbolic learning, whereas Dyna is a more general weighted logic programming framework with a different execution model.
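On the differentiable side, the point is just that if the semiring's plus/times are smooth in the fact weights, the weight of any derived fact is a differentiable expression in the inputs, so gradients can flow back into whatever produced those weights (e.g. a neural net). Another toy sketch of my own, using JAX only to show the gradient (not Scallop's API; the probability semiring here uses product for conjunction and noisy-or for alternative proofs, assuming independence):

    import jax
    import jax.numpy as jnp

    # Weight of path(a,c) given learnable edge probabilities, under a
    # probability semiring: times = product, plus = noisy-or of the two proofs.
    def prob_path_a_c(edge_probs):
        p_ab, p_bc, p_ac = edge_probs[0], edge_probs[1], edge_probs[2]
        via_b = p_ab * p_bc                 # path(a,c) <- edge(a,b), edge(b,c)
        return via_b + p_ac - via_b * p_ac  # combined with path(a,c) <- edge(a,c)

    probs = jnp.array([0.9, 0.5, 0.3])
    print(prob_path_a_c(probs))             # query weight: 0.615
    print(jax.grad(prob_path_a_c)(probs))   # gradient w.r.t. each edge probability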
Going to try it out.
[1] https://www.scallop-lang.org/
[2] https://news.ycombinator.com/item?id=43443640
[3] https://matthewfl.com/research#phd