I don't understand the presoaking. In the source of the linked article, it says it takes 1.5 hours of presoaking compared to only 10 minutes when cooking in boiling water. So it seems that when you split heating and rehydrating, cooking time changes drastically.
The same idea, but somehow even worse. Beyond forcing the companies the law applies to to pay the very people who asked them to index and promote their websites:
a) The Minister will name a series of companies they deem the law will apply to, and it applies only to them. (Currently only Google & Facebook. The Minister recently spoke with Microsoft and decided not to add Bing to their hitlist.) The Minister can change this at any time, without Parliamentary oversight.
b) The companies it applies to have to both reveal the inner workings of their algorithms to selected news companies, and give 14 days notice of any changes that will happen.
c) The companies it applies to are not allowed to stop indexing or showing news content. They can't decide not to do business unless they stop being a business altogether.
d) The companies that will be paid under this scheme are selected by the Minister, and can change at any time without Parliamentary oversight. So it will have a net zero effect on independent journalism.
Yes, this sounds even worse. Letting the government decide which companies have to pay is definitely not fair: the government could target companies it doesn't like and exempt pro-government ones. Honestly, it looks as if this law was written directly by mass-media lobbyists. It is unfortunate that this kind of anti-democratic regulation is being passed all over the world nowadays; it is as if corporate lobbying were stronger than the common interest.
Factually, it is more legible, more maintainable, and easier to extend. However, imo, that is mostly because all of the I/O has been abstracted away.
In the python program, I/O and logic were interspersed in a way that made it difficult to reason about only one or the other. You have to understand our data models to understand the code, and vice versa.
If I were to rewrite the python program today, it would "look" a lot like the Prolog one, and probably be equally as legible and maintainable.
I think it's an example where Prolog (as a "logic" language) "shepherds" you into doing the right thing with respect to I/O separation. I imagine that if I tried writing it in Haskell or OCaml, the effect would be similar. I'm reminded of this article: https://news.ycombinator.com/item?id=22696229.
Python, meanwhile, is a blank canvas and a full set of paints, which is its great strength in experienced hands, but also its great downfall. When I leave the company, there's nothing stopping another engineer (a junior, or even an experienced engineer with different skill trees) from rewriting things in a way that "works" but compromises legibility or extensibility.
With discipline, you can develop a Python program that looks quite similar to the Prolog one, but the only guardrail against spaghetti is psychological.
In any case, it was a fun exercise. It makes me want to try writing more things in different languages, if for no other reason than to improve my usage of the subset of Python I like best. :)
In my experience, Python gives the best results when you add a little bit of computer science to the mix. It is well within reason to use state machine[0] and constraint programming[1] libraries, all from within Python; there's no real need for an extra language.
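For what it's worth, here's a stdlib-only sketch of the declarative style those constraint libraries give you. The real libraries do much smarter backtracking; the brute-force `solve`, the map-coloring instance, and all the names below are just my illustration:

```python
from itertools import product

# Tiny constraint-satisfaction sketch: variables, domains, and binary
# constraints, solved by brute-force enumeration over the domains.
# (Real CSP libraries prune with backtracking instead of enumerating.)
def solve(variables, domains, constraints):
    for values in product(*(domains[v] for v in variables)):
        assignment = dict(zip(variables, values))
        if all(check(assignment[a], assignment[b]) for a, b, check in constraints):
            yield assignment

# Example: color three mutually adjacent regions with three colors.
variables = ["A", "B", "C"]
domains = {v: ["red", "green", "blue"] for v in variables}
ne = lambda x, y: x != y
constraints = [("A", "B", ne), ("B", "C", ne), ("A", "C", ne)]

solutions = list(solve(variables, domains, constraints))
print(len(solutions))  # 6 -- the 3! permutations of the three colors
```

The point is that the problem statement (variables, domains, constraints) stays data, cleanly separated from the search.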
Reading the wiki for Constraint Satisfaction Problems (CSPs) brought back good memories of the time I used Knuth's Dancing Links and Algorithm X (DLX)[1] to solve sudoku (which apparently can be modeled as a CSP as well) for my Object-Oriented class back when I was an undergrad. I really wish I had generalized my Algorithm X program (like Knuth did) to solve any exact cover problem, but a lot of what was going on in his paper was unclear to me at the time, and I didn't notice until later that he had solved it in general. It was really fun to finally see the program working, and it was surprising how quickly it ran given that DLX is basically a brute-force backtracking search.
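For anyone curious, the generalized search itself fits in a few lines without the linked lists. Here's a sketch of Algorithm X over dicts of sets (the toy instance and helper names are mine; Knuth's DLX runs the same search but uses doubly linked lists so that undoing a choice is cheap):

```python
# Minimal Algorithm X sketch for exact cover: pick the constraint with
# the fewest candidate rows, try each row, recurse, backtrack.
def exact_cover(X, Y, solution=None):
    """X: constraint -> set of row names; Y: row name -> list of constraints."""
    if solution is None:
        solution = []
    if not X:
        yield list(solution)
        return
    c = min(X, key=lambda c: len(X[c]))  # Knuth's min-branching heuristic
    for r in list(X[c]):
        solution.append(r)
        cols = select(X, Y, r)
        yield from exact_cover(X, Y, solution)
        deselect(X, Y, r, cols)
        solution.pop()

def select(X, Y, r):
    # Remove every constraint row r satisfies, and every row that clashes.
    cols = []
    for j in Y[r]:
        for i in X[j]:
            for k in Y[i]:
                if k != j:
                    X[k].remove(i)
        cols.append(X.pop(j))
    return cols

def deselect(X, Y, r, cols):
    # Undo select() in reverse order.
    for j in reversed(Y[r]):
        X[j] = cols.pop()
        for i in X[j]:
            for k in Y[i]:
                if k != j:
                    X[k].add(i)

# Toy instance: cover constraints {1,2,3,4} with rows hitting subsets.
Y = {"A": [1, 4], "B": [2, 3], "C": [1, 2], "D": [3, 4]}
X = {j: {r for r in Y if j in Y[r]} for j in range(1, 5)}
solutions = [sorted(s) for s in exact_cover(X, Y)]
print(sorted(solutions))  # [['A', 'B'], ['C', 'D']]
```

Sudoku reduces to this by making one row per (cell, digit) choice and one constraint per cell, row-digit, column-digit, and box-digit requirement.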
I don’t know enough about CSPs to say whether Knuth’s Dancing Links or DLX is used all that often for them. I’m guessing an exact cover problem is one such CSP, but CSPs also seem to rely on backtracking, and Dancing Links plus sparse matrices struck me as a good combo for that. Anyone have thoughts, input, or corrections on whether what I said about CSPs is correct?
Yep, totally agree. Python's metaprogramming is powerful enough that you can twist it into any shape you like! :)
My favorite gem, especially from before python3 and type hinting were in wide use, is attrs [0]. It lets you define rich data types (which python is great for) in a declarative/expressive/legible manner (which python sometimes isn't great for).
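For a self-contained taste of that declarative style without the third-party dependency, the stdlib dataclasses module (itself directly inspired by attrs) does something similar; the `Interval` type below is just my example, not anything from attrs:

```python
from dataclasses import dataclass, field

# Declarative data type: __init__, __repr__, and __eq__ are generated
# for us, much as attrs does with @attr.s and attr.ib().
@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float
    tags: tuple = field(default=())

    def width(self) -> float:
        return self.hi - self.lo

a = Interval(0.0, 2.5)
print(a)                        # Interval(lo=0.0, hi=2.5, tags=())
print(a == Interval(0.0, 2.5))  # True
```

attrs still goes further (validators, converters, slots on old Pythons), which is exactly the kind of enhancement metaprogramming makes possible.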
imo, it's a fantastic example of how metaprogramming allows libraries to actually enhance a language. Another one that demonstrates the power of MP is fire[1], a tool that allows you to drive classes with a MP-generated CLI.
This is true regardless of language. There's a reason I insist on teaching what should be core CS topics to my new-graduate coworkers (many of whom have been EEs with limited programming experience; their solutions tend to be brute-force and unmaintainable for the first few years). These topics simplify many programs and increase their clarity.
I meant to compare Python with languages that support logic or functional programming, which help convert a problem from loops and nested conditionals into declarative statements. But you are right: those techniques do that job too, and in turn make your job much easier.
This. Thank you. Don't forget that there is no proof that plastic in oceans harms humans. On top of that, the infamous Great Pacific Garbage patch is actually made of fishing gear rather than of the despised plastic bottles.