It's great for declaring the known facts and letting it find the best possible answers automatically. Not as strict as Prolog, and it has loops. Much easier to work with, and extremely fast. I generate the facts automatically.
very interesting! i worked briefly for a failed cad startup, and the dwg format was the bane of our existence. (we didn't try to reverse engineer it ourselves; we relied on an unfortunately buggy library we purchased)
PySwip[1] makes SWI-Prolog more accessible by allowing much of a system to be written in more familiar Python. Something analogous for Picat might be interesting, but seemingly stumbles a bit on the lack of dynamic predicates.[2][3]
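To give a flavor of the appeal, here's a toy pure-Python sketch of the pattern (this is not PySwip's actual API): facts stored as tuples, and a rule written as a generator that enumerates bindings the way a Prolog predicate enumerates solutions on backtracking.

```python
# Toy fact base: parent(tom, bob). parent(bob, ann).
facts = {("parent", "tom", "bob"), ("parent", "bob", "ann")}

def parent(x, y):
    # "Predicate" call: yield each fact matching the (possibly None) args.
    for rel, a, b in facts:
        if rel == "parent" and (x is None or x == a) and (y is None or y == b):
            yield (a, b)

def grandparent(x, z):
    # Rule: grandparent(X, Z) :- parent(X, Y), parent(Y, Z).
    for a, b in parent(x, None):
        for _, c in parent(b, z):
            yield (a, c)

print(list(grandparent("tom", None)))  # [('tom', 'ann')]
```

The real PySwip instead hands your queries to an embedded SWI-Prolog, so you get proper unification and a mature solver rather than hand-rolled generators.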
> it allows to directly convert the problem statement into an efficiently solvable declarative problem specification without inventing an imperative algorithm. -- Sergii Dymchenko
This quote from the front page reminds me of the motivation for Autograd (and other AD frameworks)
> just write down the loss function using a standard numerical library like Numpy, and Autograd will give you its gradient.
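The "write the function, get the gradient" idea can be sketched with a toy forward-mode AD based on dual numbers. This is an illustration of the concept only, not Autograd's actual (reverse-mode) implementation:

```python
class Dual:
    """Carries a value and its derivative through arithmetic."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # Product rule: (fg)' = f g' + f' g
        return Dual(self.val * o.val, self.val * o.dot + self.dot * o.val)
    __rmul__ = __mul__

def grad(f):
    # Derivative of a scalar function at a point: seed dot = 1.
    return lambda x: f(Dual(x, 1.0)).dot

loss = lambda x: 3 * x * x + 2 * x + 1   # just write down the function...
print(grad(loss)(2.0))                   # ...and get its gradient: 14.0
```

The derivative of 3x² + 2x + 1 is 6x + 2, which is 14 at x = 2, so the machinery checks out on this tiny case.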
or even probabilistic programming languages like Stan, where you can write down a Bayesian model and get posterior samples.
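The PPL workflow can likewise be sketched in a few lines: state the model, get the posterior. This toy uses grid approximation with a flat prior for a coin-flip model; Stan's actual engine (HMC/NUTS sampling) is of course far more sophisticated.

```python
# Observed data: 7 heads in 10 flips.
heads, flips = 7, 10

thetas = [i / 100 for i in range(1, 100)]        # parameter grid

def likelihood(theta):
    return theta**heads * (1 - theta)**(flips - heads)

unnorm = [likelihood(t) for t in thetas]          # flat prior: just the likelihood
z = sum(unnorm)
posterior = [p / z for p in unnorm]               # normalize over the grid

post_mean = sum(t * p for t, p in zip(thetas, posterior))
print(round(post_mean, 2))  # close to the exact Beta(8, 4) mean, 8/12 ≈ 0.67
```

With a flat prior the exact posterior is Beta(8, 4), so the grid mean should land near 8/12; that's the "write the model down, read the posterior off" experience in miniature.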
Working backwards (as I know Stan but not Picat), I guess to really put the language to work you need to be aware of limits of the implementations, and how to dance around them.
Well, instead of a language pretending the abstractions are watertight, be transparent: present them as leaky. Pure functions are a good example of a well-behaved abstraction, since you can always step in, see what's happening, and step out. Stricts are ok. Macros are ok. Beyond that, someone somewhere is hiding something and you have to figure it out. This happens at the CPU level as well.
I'm glad that logic programming isn't completely dead. I think once gradient descent has filled its niches well and the AI summer quietens a little, we might see a return to knowledge engines and decision trees as a useful complement (if not a supplement) for the class of problems not well served by neural nets.
I think a discussion on Picat would be a little incomplete without one on Mercury. It combines pure functional programming with logic programming and is plenty fast.
New logic-programming-only languages have very little chance of adoption. What I would like to see more is logic-programming features enabled in current mainstream multi-paradigm languages, like in Kotlin:
https://github.com/Kotlin/KEEP/pull/199
Loops, optional destructive assignment, tabling (although some Prolog implementations also have tabling), better constraint programming support than many Prolog implementations.
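For readers unfamiliar with tabling: it's essentially automatic memoization of predicate calls, so a naively exponential recursion becomes linear. A Python stand-in for what Picat's `table` declaration buys you (the syntax below is Python's, not Picat's):

```python
from functools import lru_cache

@lru_cache(maxsize=None)   # plays the role of the "table"
def fib(n):
    # Naive recursive definition; with caching, each n is computed once.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(90))  # instant; the untabled version would take astronomically long
```

In Picat you'd get the same effect by writing `table` above the predicate definition, with no change to the recursive logic itself.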
With version 3.0 Picat is also mostly backward-compatible with Prolog.
One of the main differences is that Picat allows both deterministic functions (like in functional languages) and non-deterministic predicates (like in logic languages), which often simplifies code and makes for more efficient execution (although functions are translated into predicates).
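The function/predicate distinction can be sketched with Python stand-ins (this is not Picat syntax): a deterministic function returns exactly one result, while a non-deterministic predicate behaves like a generator, producing one answer per choice point and more on backtracking.

```python
def double(x):
    # Function: exactly one result for each input.
    return 2 * x

def member(xs):
    # Predicate: a choice point per element; "backtracking" asks for
    # the next yield.
    for x in xs:
        yield x

print(double(21))                                     # 42
print([x for x in member([1, 2, 3]) if x % 2 == 1])   # [1, 3]
```

In Picat the compiler makes a similar translation under the hood: functions become predicates with an extra output argument, but the function form documents that only one answer is expected.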
In my opinion, Picat's biggest attraction is its facilities for constraint solving and optimization, which are in some respects state of the art.
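The constraint-programming style can be illustrated with a brute-force toy: state the constraints, enumerate candidates, keep what satisfies them. Picat's `cp` module does this with real constraint propagation and search rather than exhaustive enumeration, but the declarative shape is the same. The puzzle below (the cryptarithm TO + GO = OUT) is my own example, not from Picat's docs:

```python
from itertools import permutations

# TO + GO = OUT: assign distinct digits to T, O, G, U, no leading zeros.
solutions = [
    (t, o, g, u)
    for t, o, g, u in permutations(range(10), 4)
    if t != 0 and g != 0 and o != 0
    and (10 * t + o) + (10 * g + o) == 100 * o + 10 * u + t
]
print(solutions)  # [(2, 1, 8, 0)]  ->  21 + 81 = 102
```

In a CP system you would instead declare domains and constraints (`all_different`, the arithmetic equation) and call the solver, which prunes the search space instead of trying all 5040 permutations.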