If you consider Nature in general, nothing is good or bad. Colder is better for organisms that prefer cold, warmer for those that prefer warm. No life at all might be best for the geological (or should I say areological) wonders on Mars.
Ethically speaking, I think most climate change "alarmists" (what is the equivalent word with positive connotation?) are either worried for the species that will go extinct, or the poorer humans that will suffer the most from it.
The BSD license is developer-centered. It's awesome for developers, mostly avoiding any licensing headache.
The GPL is user-centered. It's not designed to be nice to developers, but to protect the user's freedoms.
So to a first approximation I expect that developers who think more of developers' freedoms favor the BSD license, while developers who think more of users' freedoms favor the GPL.
> The GPL is user-centered. It's not designed to be nice to developers, but to protect the user's freedoms. *
* Unless those users are developers.
I still haven't figured out the distancing from and discouragement of the LGPL. A lot of software doesn't draw much of a distinction between user and programmer, and the GPL is one of the worst licenses for a library.
We saw this throughout the Obama tenure as well: more executive orders...
Number of executive orders per president, per year in office[1]:
Theodore Roosevelt 144.7
William Howard Taft 181.0
Woodrow Wilson 225.4
Warren G. Harding 216.9
Calvin Coolidge 215.2
Herbert Hoover 242.0
Franklin D. Roosevelt 307.8
Harry S. Truman 116.7
Dwight D. Eisenhower 60.5
John F. Kennedy 75.4
Lyndon B. Johnson 62.9
Richard Nixon 62.3
Gerald Ford 69.1
Jimmy Carter 80.0
Ronald Reagan 47.6
George H. W. Bush 41.5
Bill Clinton 45.5
George W. Bush 36.4
Barack Obama 34.6
I think that if you looked at the scope of executive orders over time, it would be instructive. Teddy Roosevelt's executive orders were simple and limited.[1] Modern executive orders have very wide-ranging consequences (such as allowing departments to share electronic surveillance without a warrant).
Examples include, "Authorizing Appointment of Translator in Bureau of Insular Affairs Without Examination," "Authorizing Reinstatement of Charles B. Terry as Clerk in Post Office Department Without Examination," "Amending Civil Service Rules to Except Commissioners of National Military Parks from Examination," etc.
> I think that if you looked at the scope of executive orders over time it would be instructive
Indeed, but instructive in an opposite direction IMO.
The sum of Obama's executive orders pales in comparison to the other Roosevelt's singular Executive Order 9066, for example. And that was hardly the only controversial FDR order.
Vietnam was really the first time that a war-time president didn't suspend the civil liberties of a crap-load of Americans.
The raw numbers are not really useful. There are a lot of EOs which do not push the bounds of presidential authority (awarding medals, ordering flags to half-mast, etc). They are not comparable to EOs which, for example, order the government to not enforce immigration law, or use torture.
No, a hypothesis was asserted, and quickly disproven by data. Without stronger data, the rest of these responses are called "backpedaling", no matter how positive your language might sound.
Yeah, the naked assertion of "more executive orders" is plainly false by the numbers. But, looking at GP's point more charitably, does that invalidate the spirit of the comment? I think it's clear that it does not. Instead, obviously we must consider the overall force of executive action in creating or changing policy, in order to evaluate whether the contents of this argument are supported by data.
And, FWIW, it's likely that the comment doesn't consider the absurd lengths to which executive power was pushed at various times in the first century of the republic, not only to create policy but also to eviscerate the decisions of the judicial branch (obviously "John Marshall has made his decision; now let him enforce it" comes to mind).
This is a rich and complex topic; moving the goalposts away from a discussion of the raw number of signed executive orders is quite sensible IMO.
Agreed. That is probably beyond the purview of an HN comment. If you are interested in the topic, there are a number of current political science researchers who focus on the powers of the presidency. Nancy Kassop (my thesis professor) might be an interesting author with whom to start: her work is precise yet easy to understand.
The content of the orders is probably just as important as the number, if not more so. But it's hard to measure the relevant stats in an objective way that satisfies both sides.
Obama took unilateral action with Presidential Memoranda instead of Executive Orders.
"Like executive orders, presidential memoranda don't require action by Congress. They have the same force of law as executive orders and often have consequences just as far-reaching. And some of the most significant actions of the Obama presidency have come not by executive order but by presidential memoranda."
Some of those numbers seem to be affected by significant world events (world wars, economic depressions). I wonder if filtering out some of the EOs specific to those kinds of things might smooth things out a bit?
All the data I have seen point to "altruism" [1] more than a war with Microsoft. Consider the list of Project Zero reports where the deadline was exceeded (thanks to brainfog for the URL): https://bugs.chromium.org/p/project-zero/issues/list?can=1&q...
[1] Certainly not literally altruism. I suppose Google thinks this project benefits everybody, including themselves.
> I suppose Google thinks this project benefits everybody, including themselves.
That's quite possible but once you start releasing unpatched vulnerabilities about competitor products there is at least a chance that 'including themselves' trumps 'everybody'.
So, Google made a nice little 'hands-off' automatic disclosure feature, which gives them a reason to say 'the computer did it', but I don't think for an instant that 'altruism' of any kind is the reason they do this.
If Google really had the well-being of internet users at heart, they'd shut down Google Analytics and stop accumulating profiles.
Until they do that my money is on Google estimating that they will do others more damage than they will do themselves through Project Zero and as such yes, we will all benefit but Google will benefit the most of all.
That's a false dichotomy, and it's also a very one dimensional take on what altruism means in the context of capitalism. There is a world of nuanced altruism between "has analytics that violate some peoples' privacy expectations" and "totally altruistic."
What's more, their analytics have nothing to do with the point at hand. Why even bring it up, except as an opportunity for a tangential soapbox? We're talking about disclosure timelines.
It has a lot to do with the point at hand because you can't really attack Microsoft on their web tracking because they don't do any.
So if Google wanted to attack MS on the subject of privacy they'd have to go all the way to Skype to get some traction. So instead they attack on a front where Google is strong and Microsoft slightly weaker.
Google is anything but altruistic; their each and every move is meant to improve the bottom line for Google and their shareholders. If something is really altruistic, it likely falls under their PR budget.
There are much greater concerns with browser exploits than just privacy unless you're being really literal-minded and saying that someone getting their bank account emptied just had an extreme privacy violation.
- Naive, idiomatic and wrong use of the API automatically becomes naive, idiomatic and correct: Existing code that was using wall-clock time will now be using monotonic time when it should, and only when it should [1]
- No change to the memory footprint on 64-bit systems
- No change to the range of representable dates [2]
"No API change" means if for some reason it turns out to be a bad idea, they can still revert it and stay backward compatible (though of course, the documentation will mention how monotonic times are used to calculate better time differences and that would no longer be true after a revert).
Very impressed with the extensive survey of existing code which didn't find a single case where the change would cause an issue.
[1] Except when a user went out of their way to calculate a time difference in a non-idiomatic way.
[2] The range is only restricted when monotonic time information is present, which cannot sensibly be the case outside the restricted range.
On Unix, an empty file is a working implementation of the "true" command (provided that is in the PATH and set as executable). This is because it is interpreted as a shell script, and an empty script of course exits successfully.
I'm pretty sure this was actually used in some Unix/Linux version, and they got bug reports due to the poor performance (executing a shell to do nothing), which makes for a lot of bugs per line of code. Unfortunately I can't find a reference. Instead I found that AT&T Unix implemented "true" as an empty file... with a copyright notice! See http://trillian.mit.edu/~jc/humor/ATT_Copyright_true.html
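A minimal demonstration of the trick, assuming a POSIX-ish shell and a writable current directory (`mytrue` is a made-up name): when `exec` fails with ENOEXEC on the empty file, the shell falls back to running it as a script, and an empty script exits with status 0.

```shell
# Create an empty, executable file and run it.
touch mytrue
chmod +x mytrue
./mytrue
echo $?   # 0: the empty "script" exits successfully, just like true(1)
```

The performance complaint makes sense too: each invocation pays for spawning a shell that then does nothing.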
IIRC you could resize the standard cmd console easily, just type "mode 160" (or whatever width you want). I don't have a Windows installation around to check it, but maybe someone can confirm?
All functionality used to reside in conhost.exe, no matter which character mode application you used (command prompt or PowerShell), which is why a lot of commands would work for both character modes. This is not how things work now (though a lot of commands are still shared, like “mode 120,120”) as conhost.exe now just decides if it should give you the legacy console host or the new one (with things like buffer improvements and word wrapping).
There are a lot of subtle improvements I think many are not aware of, like, if you paste in text with smart quotes they will be changed to straight quotes.
Indeed. But looking at the GC improvements in Go 1.8 ("typical" GC pauses of less than 100us), the set of cases where you can't use the language might be shrinking significantly. Now if only they could turn that into some sort of guarantee...
Pause time is not the sole performance metric of a GC! Throughput matters just as much, if not more!
GC pauses are not the only reason to use Rust. Rust is not "little Go" that you reach for only if you don't want GC. You might want package management, data-race-free concurrency, a mature optimization framework, runtime-free operation, concurrent data structures, fast C interfacing, etc. etc.
You're right, of course, about the GC metrics. And there was some disappointing increase in GC CPU usage with the 1.8 changes (I don't know the current status). But some of the items you mention (package management, a mature optimization framework) will hardly create cases where Go cannot be used, which was the original point. Same for concurrent data structures, which can be implemented in Go, even if the lack of generics makes it less convenient. I do agree with you regarding the other items.
Aggressive compiler optimizations are not optional in many domains.
One rule of thumb that a lot of people don't realize is that if you aren't maxing out your sequential performance, your parallel multicore algorithm usually loses to an optimized sequential one. The reason is simple: parallelism introduces overhead, leading to guaranteed sublinear speedups. Compiler optimizations, on the other hand, frequently result in multiple factors of improvement.
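A toy back-of-the-envelope version of this rule of thumb (all numbers are made up for illustration): suppose compiler optimizations make the sequential version 3x faster, while parallelizing the unoptimized version across 4 cores yields only a 2.5x effective speedup once coordination overhead is paid.

```python
# Hypothetical timings, in seconds (illustrative only).
t_naive = 1.0                   # unoptimized sequential baseline
t_optimized = t_naive / 3.0     # compiler optimizations: 3x speedup
t_parallel = t_naive / 2.5      # 4 cores, but only 2.5x after overhead

# The optimized sequential version wins despite using one core.
print(t_optimized < t_parallel)  # True
```

With these (hypothetical) numbers, parallelizing the *optimized* version is what you'd actually want, which is the point: max out sequential performance first.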
I don't think this is right. The energy of a photon is h multiplied by the frequency, but the frequency itself can be anything. Even if the process creating the photon is inherently quantized (e.g. bound electron with quantized energy levels), we will measure slightly different frequencies (and so different energies), due for example to the Doppler effect[1]. Still with the Doppler effect, you can give a photon an arbitrary frequency (energy) by changing the velocity of the reference frame where you make the measurement (that is, until someone shows that the velocity of the reference frame is quantized... not something that is presently known).
Yes, that's correct. This does not mean that the energy levels of light (more generally, EM radiation) are quantized -- photons can have any energy level. What it means is that, for light of a given frequency, you can deposit energy only in units of h(nu).
The quantization of light shows up in how it interacts with particles -- even unbound particles like free electrons, which also do not have quantized energy levels. Specifically, if light were NOT quantized, you could get the same effect with more intense light that you get with more energetic light. Instead, experiments show again and again that longer-wavelength light at high intensity gives a totally different effect from short-wavelength light at low intensity. Postulating that light consists of particles (photons) with E = h(nu) explains this difference.
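A quick numerical sketch of the point above (helper names are made up; the constants are standard values): the energy E = h·nu varies continuously with nu, and a relativistic Doppler shift scales nu, and hence E, by an arbitrary continuous factor.

```python
h = 6.62607015e-34   # Planck constant, J*s
c = 299792458.0      # speed of light, m/s

def photon_energy(nu):
    """E = h * nu: continuous in nu, quantized only per-photon."""
    return h * nu

def receding_doppler(nu, v):
    """Frequency seen by an observer receding at speed v < c
    (longitudinal relativistic Doppler shift)."""
    beta = v / c
    return nu * ((1 - beta) / (1 + beta)) ** 0.5

nu = 5.0e14                                        # green light, Hz
e0 = photon_energy(nu)                             # ~3.3e-19 J
e1 = photon_energy(receding_doppler(nu, 0.1 * c))  # same photon, receding frame
print(e1 < e0)  # True: the photon's energy depends on the reference frame
```

Varying `v` continuously sweeps `e1` over a continuum of values, which is exactly why per-frequency quantization (E = h·nu per photon) does not mean photon energies themselves are quantized.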