Suppose you touch a fireplace once. Do you touch it again? No.
OK, here's something much stranger. Suppose you see your friend touch the fireplace and recoil in pain. Do you touch it? No.
Hmm... whence statistics? There is no frequency-based association here, in either case. And in the second, not even any direct experience of the fireplace.
The entire history of science is supposed to be about the failure of statistics to produce explanations. It is a great sin that we have allowed pseudosciences to flourish in which this lesson isn't even understood; and worse, that we have allowed statistical showmen with their magic lanterns to preach on the scientific method. To the point where it seems, almost, that science as an ideal has been completely lost.
The entire point was to throw away entirely our reliance on frequency and association -- this is ancient superstition. And instead, to explain the world by necessary mechanisms born of causal properties which interact in complex ways that can never uniquely reveal themselves by direct measurement.
Off the top of my head: It's easy to use, nice-looking, easy to create your own community (in seconds), feature-rich, and completely free.
The downside of things not being searchable/preserved long-term is not all that important to people. Obviously, it'll lead to a lot of information being lost, but I think people (particularly young people) have difficulty thinking long-term.
I remember coming across FRACTRAN long long ago on some esolang site and wondering how anyone could write anything in it to solve even the simplest of problems. What unlocked it for me was realizing that FRACTRAN programs can be thought of as a list of chemical reactions that are tried sequentially, where if any succeeds we start back at the first reaction. Since fractions are reduced, there is an arbitrary constraint that the reactions have no catalysts -- chemical species that appear as both reagents and products in the same reaction. Also, unlike real chemistry, reactions don't have to conserve mass/energy. (If you like, a FRACTRAN program is a list of rewrite rules in a commutative monoid, where a rule only applies if none of the previous ones apply.)
With this in mind, you can write a program that takes two numbers, represented as the numbers of A and B species, and returns the sum as the number of C species like so:
A -> C
B -> C
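To turn those reactions into an actual FRACTRAN program, assign each species a prime -- say A=2, B=3, C=5 -- and the two reactions become the fractions 5/2 and 5/3. Run on 2^a * 3^b, the program halts at 5^(a+b).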
Or, how about multiplication instead? We need some intermediate products U1, U2, and X to calculate this:
U1 + B -> U2 + C + X
U2 -> U1
U1 -> 0
X -> B
A -> U1
This does, roughly, "for each A, add each B to C." Since the addition consumes the B's, the X species is used to save the old value of B so it can be restored for the next pass. The presence of U1 or U2 indicates the program is in this addition loop.
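If you want to see it run, here is a minimal interpreter sketch in Python (the prime assignment below is my own arbitrary choice, not Conway's):

def run(fractions, n):
    # Try each fraction in order; the first one that keeps n an
    # integer "fires", and we start over from the top. Halt when
    # none applies.
    while True:
        for p, q in fractions:
            if n % q == 0:
                n = n * p // q
                break
        else:
            return n

# The multiplication program above, with primes
# A=2, B=3, C=5, U1=7, U2=11, X=13:
MUL = [(11*5*13, 7*3),  # U1 + B -> U2 + C + X
       (7, 11),         # U2 -> U1
       (1, 7),          # U1 -> 0
       (3, 13),         # X -> B
       (7, 2)]          # A -> U1

n = run(MUL, 2**4 * 3**6)   # a = 4 A's, b = 6 B's
c = 0
while n % 5 == 0:           # read the answer off the exponent of 5
    n //= 5
    c += 1
print(c)                    # prints 24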
There is something wonderfully simple about FRACTRAN as a model for computation. Even Turing machines seem complicated in comparison.
By the way, Conway in his "FRACTRAN: a simple universal programming language for arithmetic" describes how to construct larger programs with control flow. From the above point of view, you essentially add a new species like LINE100 to the left-hand side of each reaction and a corresponding LINE101 to the right-hand side. In this case, the program would go from "line 100" to "line 101."
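For example, a reaction A -> C that should only run at line 100 becomes LINE100 + A -> LINE101 + C; in fraction terms, the numerator picks up the prime assigned to LINE101 and the denominator the prime assigned to LINE100.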
---
While there is an obvious way to write an interpreter for FRACTRAN, on anything but the simplest of programs it will run absurdly slowly, because everything is done in unary arithmetic (in the exponents of the primes). Is your input an ASCII-encoded string and you want to get the first character? You're going to need a loop that repeatedly subtracts 256 from the input to do the modular reduction. Good luck parsing anything substantial before the universe dies!
Someone figured out how to accelerate these sorts of loops: https://pimlu.github.io/fractran/ It appears to look for constructs that add or subtract constants to the state variables, then skip over the intermediate steps and calculate their total effect. (It looks like it was a term project, and they have a paper about it in the GitHub repository.)
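I haven't read their paper, but the simplest version of this trick is easy to sketch. If a fraction's numerator is coprime to every earlier fraction's denominator, then firing it can never make an earlier fraction applicable, so it will keep firing for as long as its own denominator divides the state -- and all of those firings can be collapsed into one step. A rough Python sketch of that special case (my own simplification, not necessarily what their project does):

from math import gcd

def run_accelerated(fractions, n):
    # A fraction may be "batched" if its numerator is coprime to every
    # earlier denominator: firing it can then never unlock an earlier
    # fraction, so it keeps firing until its own denominator stops
    # dividing the state. (Assumes fractions are in lowest terms,
    # as in FRACTRAN proper.)
    batchable = [all(gcd(p, fractions[j][1]) == 1 for j in range(i))
                 for i, (p, _) in enumerate(fractions)]
    while True:
        for i, (p, q) in enumerate(fractions):
            if n % q == 0:
                if batchable[i]:
                    k = 0
                    while n % q == 0:   # count consecutive firings
                        n //= q
                        k += 1
                    n *= p**k           # apply all k at once
                else:
                    n = n * p // q
                break
        else:
            return n

For something like the mod-256 loop above, this at least avoids re-scanning the whole program between firings; a real accelerator would go further and compute k directly on a vector-of-exponents representation of the state.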
I think we need to be clearer by what "product" means when it comes to monopolies in the Information Age.
In the Industrial Age, you had a monopoly on paperclips if the only way to buy a paperclip was through your company. But paperclips themselves are commodity-like: there are billions of them out there and each is more or less interchangeable with the others.
Information products are not like paperclips. Each piece of information is by definition unique, and its value to the consumer is predicated on that uniqueness. When you buy a picture frame from a store, the first thing you do is throw out the little paper photo that's in it and replace it with yours. Why? The previous image was a picture of a smiling family. Isn't your goal with the product to have a framed photo of a smiling family? Why not just save yourself the trouble and keep the paper?
Well, it turns out that the fact that you want a smiling photo of *your* family is highly salient.
Sure, there are lots of social networks. If I want to find an app that has humans on it that I can connect with, there is definitely no monopoly. But if I want an app that lets me connect with my actual friends, then my choices are limited to exactly the social networks they actually use. If I want an app that doesn't just let me receive event invitations, but lets me receive the actual invitations my real friends send, I sure as hell better be on that one particular app. That app has an iron-clad, complete monopoly on those events.
Almost every media or information company has thousands of micro-monopolies on various unique pieces of data. Our simple notion of trusts does not accommodate that concept. We need to update our thinking to the 21st century.
>I'd much rather mow my lawn with a mower than a pair of scissors.
I think you didn't go far enough with the implications of Carse's statement. It's not just that a machine mows your lawn quicker; it's that you have a homogeneous lawn in the first place because it is a thing that can be mechanically mowed.
For example, why do more people not have a wild garden with uneven terrain instead of a lawn? Why did everyone feel compelled to put a green, square, empty plot of land in front of their houses? Is there anything interesting or alive in it, or does it exist because it can be mechanically operated on? Given that mowing the lawn is something 'you want to get over with', is it not more accurate to say the lawnmower needs you to shove it around, in a system that efficiently maximises lawn-mower production?
To understand the implication of what Carse is saying is to understand that machines don't just enter your environment; they shape your entire perception in a way that makes the world conducive to further operation by machines.
Is living in estranged suburbs with lawns really positive for human flourishing, or is it in a sense the logic of the car and the lawnmower operating on people rather than the other way around?
Is all the corn really part of a diverse diet and ecosystem, or have we adopted the diet because it is the thing that can be mass-produced?
Solidity has far worse problems than not being an advanced research language. Just being a sanely designed normal language would be a big step up. Solidity is so riddled with bizarre design errors it makes PHP 4 look like a work of genius.
A small sampling of the issues:
Everything is 256 bits wide, including the "byte" type. This means that whilst byte[] is valid syntax, it will take up 32x more space than you expect. Storage space is extremely limited in Solidity programs. You should use "bytes" instead which is an actual byte array. The native 256-bit wide primitive type is called "bytes32" but the actual 8-bit wide byte type is called "int8".
Strings. What can we say about this. There is a string type. It is useless. There is no support for string manipulation at all. String concatenation must be done by hand after casting to a byte array. Basics like indexOf() must also be written by hand or implementations copied into your program. To even learn the length of a string you must cast it to a byte array, but see above. In some versions of the Solidity compiler passing an empty string to a function would cause all arguments after that string to be silently corrupted.
There is no garbage collector. Dead allocations are never reclaimed, despite the scarcity of available memory space. There is also no manual memory management.
Solidity looks superficially like an object-oriented language. There is a "this" keyword. However, there are actually security-critical differences between "this.setX()" and "setX()" that can cause wrong results: https://github.com/ethereum/solidity/issues/583
Numbers. Despite the language being intended for financial applications like insurance, there is no floating point support. Integer operations can overflow, even though the underlying operations are interpreted rather than implemented in hardware. There is no built-in way to do overflow-checked operations: you need manual constructs like "require((balanceOf[_to] + _value) >= balanceOf[_to]);"
You can return statically sized arrays from functions, but not variably sized arrays.
For loops are completely broken. Solidity is meant to look like JavaScript, but the literal 0 type-infers to byte, not int. Therefore "for (var i = 0; i < a.length; i++) { a[i] = i; }" will enter an infinite loop if a[] is longer than 255 elements, because i will wrap around back to zero. This is despite the underlying VM using 256 bits to store this byte. You are just supposed to know this and write "uint" instead of "var".
Arrays. Array access syntax looks like C or Java, but array declaration syntax is written backwards: int8[][5] creates 5 dynamic arrays of int8. Dynamically sized arrays work, in theory, but you cannot create multi-dimensional dynamic arrays. Because "string" is a byte array, that means "string[]" does not work.
The compiler is riddled with mis-compilation bugs, many of them security-critical. The documentation helpfully includes a list of these bugs... in JSON. The actual contents of the JSON are of course just strings meant to be read by humans. Here are some summaries of the miscompile bugs:
In some situations, the optimizer replaces certain numbers in the code with routines that compute different numbers
Types shorter than 32 bytes are packed together into the same 32 byte storage slot, but storage writes always write 32 bytes. For some types, the higher order bytes were not cleaned properly, which made it sometimes possible to overwrite a variable in storage when writing to another one.
Dynamic allocation of an empty memory array caused an infinite loop and thus an exception
Access to array elements for arrays of types with less than 32 bytes did not correctly clean the higher order bits, causing corruption in other array elements.
As you can see, the decision to build a virtual machine that is natively 256 bits wide led to a huge number of bugs whereby reads or writes randomly corrupt memory.
Solidity/EVM is by far the worst programming environment I have ever encountered. It would be impossible to write even toy programs correctly in this language, yet it is literally called "Solidity" and used to program a financial system that manages hundreds of millions of dollars.