
This isn't strictly true.

The protocol ensures that there is on average a block produced every 10 minutes.

As hash rates increase and blocks are found slightly faster, the difficulty is adjusted upwards to keep the average at one block per 10 minutes.

We've seen difficulty drop in the past; it doesn't necessarily rise forever. It only makes sense for it to increase while it's still profitable to mine at the current difficulty.

If the difficulty rises to a certain level and the price falls, making mining unprofitable for some miners, they switch off their rigs and the difficulty adjusts downwards after a period of time to compensate.
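
For the curious, here's a rough sketch of the retargeting rule (Bitcoin recalculates every 2016 blocks and clamps the adjustment to a factor of 4 either way; this is illustrative, not the actual consensus code):

    # Rough sketch of Bitcoin-style difficulty retargeting (illustrative only).
    TARGET_SPACING = 10 * 60      # target seconds per block
    RETARGET_WINDOW = 2016        # blocks between adjustments

    def retarget(old_difficulty, actual_seconds_for_window):
        expected = TARGET_SPACING * RETARGET_WINDOW
        # Clamp so difficulty never moves more than 4x in one adjustment.
        ratio = max(0.25, min(4.0, expected / actual_seconds_for_window))
        return old_difficulty * ratio

    # Blocks arrived 20% faster than intended -> difficulty rises by ~25%.
    print(retarget(1.0, 0.8 * TARGET_SPACING * RETARGET_WINDOW))   # 1.25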

Over the years we have seen the price rise and hash optimizations made, which have both driven the difficulty upwards.


It's not strictly true in the sense that if the price were to fall over a long enough period of time, you would expect the hash rate to eventually fall too.

But that's not plausible in the scenario where the world's financial system eventually runs on a proof of work cryptocurrency.

Since all miners compete over the same finite profits, each miner individually has an incentive to increase their hash power and therefore power consumption.

Even if the price were on average constant, game theory would predict a competition over finding the cheapest way to burn the maximum amount of power.

Empirically, there have been some transient drops in hash rate for both Bitcoin and Ethereum, on top of a constant, massive run-up.


It's every 15 seconds for Ethereum blocks.

https://ethereum.org/en/developers/docs/blocks/


> This is really beginning to concern me. I think we’ve created a fake plastic economy built on fake plastic house prices. By stoking the bubble, the purchase and renovation of housing keeps pushing money through the economy.

This has been an issue for a long time now. There should have been a crash in 2008/2009, but instead the government in the UK decided to prop up house prices with various schemes. I'm not saying this is right or wrong: for many people, both existing owners and those looking to purchase, it helped them stay out of negative equity or allowed them to buy their own home.

> By constantly remortgaging, new value is “created” which allows people to have a new kitchen, go on holiday, buy a car, extend their house. But it feels like a trick to me. I’m not an economist, but where’s the wealth creation here?

I'm not sure remortgaging is creating new value. All money is loaned into existence; remortgaging is just creating new debt secured on your home. It still has to be paid back. People could still get loans, just perhaps not at the same favorable rate.

There has been a lot of money printing over the past 10 years or so, and I think rising house prices are a natural result of that. I think it's more a case of the decreasing purchasing power of the £ in your pocket than of homes rising in value. Have a look at this chart comparing house prices with gold[0]. It's also a function of the availability of loans: since you can now borrow cheaply over long periods, the total amount people are able to borrow increases, because the debt is much easier to service.
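
As a rough illustration of the rate effect (the monthly payment, rates and term below are made-up numbers plugged into the standard annuity formula), the same monthly payment supports a much larger loan when rates are low:

    # Rough sketch: the maximum loan a fixed monthly payment can service
    # (standard annuity formula; the figures are illustrative assumptions).
    def max_loan(monthly_payment, annual_rate, years):
        r = annual_rate / 12          # monthly interest rate
        n = years * 12                # number of payments
        return monthly_payment * (1 - (1 + r) ** -n) / r

    print(round(max_loan(1000, 0.06, 25)))   # ~155,000 at 6%
    print(round(max_loan(1000, 0.02, 25)))   # ~236,000 at 2%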

What's more surprising to me is that we're not really seeing inflation in other areas of the economy, such as wages or consumer goods. Inflation has been surprisingly low since the crisis of 2008/2009.

[0] http://pricedingold.com/uk-house-prices/


Was hoping someone would reply and convince me I was wrong rather than agree :-/

“Value” in quotes because it isn’t, as you said.

Regarding inflation: I think house price rises have been stoked by the stamp duty trick once again, but agree that there may also be a bit of inflation mixed in.

The reason we’re not seeing general inflation yet is the extra money supply has yet to leave bank accounts and move through the consumer economy. After things open up, we should see that money being spent and a short term boom, then as demand rises prices will respond and we’ll see inflation. At least according to Keynes, who in retrospect just seems to be more and more correct.

Wages won’t rise at the same pace because employers will exploit inflation, the trade unions are weak, and the govt doesn’t care; 120K dead and the Tories are seven points ahead, so reality doesn’t matter in U.K. politics.


Getting a mortgage isn't purchasing a home; paying off a mortgage is purchasing a home. The bank owns your home while you have a mortgage, and it has the right to repossess and sell it if you miss even your last payment.

If the housing market crashes 30% tomorrow it'll bankrupt everyone who signed on for a 30-year mortgage in the last 10 years, unless they've overpaid substantially. Very few leveraged first time buyers who bought since 2008 own their homes yet.
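
A rough sketch of the negative-equity arithmetic, with assumed numbers rather than anyone's actual figures:

    # Rough sketch with assumed numbers: a recent buyer on a 10% deposit.
    price             = 300_000
    deposit           = 0.10 * price        # 30,000
    loan              = price - deposit     # 270,000 owed (ignoring repayments to date)
    value_after_crash = price * 0.70        # 210,000 after a 30% fall
    print(value_after_crash - loan)         # -60,000 -> negative equity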


I use one to drive an ILI9341 TFT display.

It uses a 433MHz receiver to pick up temperatures from a couple of commercial temperature sensors and uses pygame to display them on the screen, plus a few bits of other info.

Pretty basic, but it works. It struggles with timing though, which I've discovered is pretty important when receiving and decoding 433MHz signals. Looking to use a Raspberry Pi Pico instead shortly.
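
For anyone curious, here's a minimal sketch of the timing-sensitive part, assuming a pigpio setup with the receiver's data pin on a made-up GPIO 27 (pigpio and the pin number are my assumptions, not necessarily what the parent uses): timestamp edges in a callback and decode the recorded pulse widths afterwards, rather than trying to keep up in the main loop.

    # Minimal sketch: capture pulse widths from a 433MHz OOK receiver with pigpio.
    # GPIO 27 is an assumption; decoding thresholds depend on the sensor's protocol.
    import pigpio

    RX_GPIO = 27
    pulses = []          # (level_after_edge, microseconds_since_previous_edge)
    last_tick = None

    def on_edge(gpio, level, tick):
        global last_tick
        if last_tick is not None:
            pulses.append((level, pigpio.tickDiff(last_tick, tick)))  # handles tick wraparound
        last_tick = tick

    pi = pigpio.pi()                  # requires the pigpiod daemon to be running
    pi.set_mode(RX_GPIO, pigpio.INPUT)
    cb = pi.callback(RX_GPIO, pigpio.EITHER_EDGE, on_edge)
    # ...let it run, then classify the recorded widths into short/long pulses to recover bits.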


> It should be noted that although it was terribly designed the Commodore 1541 disk drive was by far the most common peripheral sold with the machine. It was fairly uncommon for C64 owners to load off of tape unlike other 8 bit systems

I think this was true for the American market.

However in Europe (and I believe in Australia as well) tape was the main medium people would have used. Certainly when the C64 was being marketed towards children as late as the early 90s, it was being sold as a cheap games machine and came with the trusty datasette.

I don't think it was until the Amiga 500 (and of course the Atari ST) that the disk format really took off for the home market on this side of the pond.


I've not read Diaspora, but I have read Schild's Ladder. I found it difficult to follow in parts (the science explanations), but some of the themes it touches on are absolutely fascinating. Fans of hard science fiction will enjoy it.


I first heard of OpenStep from this brilliant post yesterday https://news.ycombinator.com/item?id=24091588 ... and here it pops up again.


Not just OpenStep, but Rhapsody! It was Apple's original transition plan for Mac OS X, basically OpenStep/Mach 5.x with a Mac OS 8-style Platinum UI done in NeXT-style Display PostScript complete with draggable menus. It's my favorite Apple product of all time just for the "what could have been" factor, but the Adobes of the world balked at "rewrite your apps in Yellow Box" so we got Carbon and friends in the rebooted Mac OS X project. There was one retail release as "Mac OS X Server 1.0", but the "Premier" desktop-user-focused version was canceled. I like to run Rhapsody on my Power Mac G3 Blue&White because that particular machine shipped with Rhapsody (in a Server G3 configuration) and is extremely well supported: https://cooltrainer.org/images/original/michiru-rhapsody-201...

And here are some good starting points for more info:

http://www.roughlydrafted.com/RD/RDM.Tech.Q1.07/4B800F78-0F7...

https://en.wikipedia.org/wiki/Rhapsody_(operating_system)

http://rhapsodyos.org/


I had no idea Rhapsody existed! I really want to give Mac OS X Server 1.0 a spin on a PPC machine!


It was actually fun to use. I had NeXTSTEP 3.3 and then 4.x with the developer tools back in '95. I was in love with PostScript and Objective-C, so this was my favorite computer. I had it running on a custom 90MHz Pentium system. It was so easy to process data on that thing.


I agree! Circa 1998, I got a tip from a friend in university IT and "rescued" a few NeXTstations and associated peripherals (displays, mice, keyboards) from an Indiana University dumpster. I emailed NeXT, and they were nice enough to send me complete software kits for both NeXTSTEP 3.3 and OPENSTEP 4.2 (all platforms) at no charge.

The monochrome displays, in particular, were lovely, and, in spite of its outdated CPU (25MHz 040 IIRC), one of these machines, running NeXTSTEP 3.3, was my primary workstation for terminal-centric work for the next couple years.

While I never did get much into Objective-C beyond the basics, as a big PostScript fan since the red/green/blue book days, I did have a bit of fun with Display PostScript (and NeWS on the SPARCstation 1 the NeXTstation ultimately shared a desk with, although, quite unlike the NeXTstation, the Sun was nearly unusably slow for day-to-day GUI use).


Similar story. I was walking through the Physics building at the University of Maryland in the summer of 2004 and a brilliant-looking black cube of a computer caught my eye. It was lined up along a wall with a bunch of drab, old computer equipment destined for the trash heap. This was before the era of web-connected phones, so I quickly peeked at the back sticker for some information (model, serial, etc.), went back to my dorm, and looked up what it could be. It was a NeXTcube, which I had never known about. So I hustled back with an empty backpack, managed to cram it in with the zippers barely holding, and walked back to my dorm. Once I got back, I took it apart and was amazed at the quality. I never booted it up, but I kept it for a year or so before selling it on eBay to a person named Chuck (vintagecomputermuseum) for $300. I really should check in with them and see if they ever got it up and running.


Circa 98 that would have been Apple, which is interesting but not unbelievable at all.


There is an interesting answer here with regard to how the Pauli exclusion principle relates to magnetism.

https://physics.stackexchange.com/a/246439

As I understand it, with my limited knowledge, deeper down it involves the exchange of virtual photons between electrons. I can see how this explains repulsion.

Though I'm not sure how this leads to attraction, or what it looks like as a Feynman diagram.


Indeed this does look very useful.

I had a quick glance over this, but for contexts defined as remote SSH hosts, I couldn't see where the images are obtained. Are they being built remotely on the host, or are they being downloaded from your local machine over a tunnel or via a Docker registry?


Not sure why you are being downvoted:

https://www.oxfordbiosystems.com/COVID-19-Rapid-test

"In order to test the detection sensitivity and specificity of the COVID-19 IgG-IgM combined antibody test, blood samples were collected from COVID-19 patients from multiple hospitals and Chinese CDC laboratories. The tests were done separately at each site. A total of 525 cases were tested: 397 (positive) clinically confirmed (including PCR test) SARS-CoV-2-infected patients and 128 non- SARS-CoV-2-infected patients (128 negative). The testing results of vein blood without viral inactivation were summarized in the Table 1. Of the 397 blood samples from SARS-CoV-2-infected patients, 352 tested positive, resulting in a sensitivity of 88.66%. Twelve of the blood samples from the 128 non-SARS-CoV-2 infection patients tested positive, generating a specificity of 90.63%."

That gives us a 62% false positive ratio according to this calculator (using a prevalence of 6%, which is what one study found among the subjects it tested):

http://vassarstats.net/clin2.html

In some cases we have research being carried out where the positive results are so low that they can be entirely accounted for by the low specificity. For example, if you took samples from 100 people, then with 90% specificity around 10 could test positive even if none of them had ever had the virus.
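
A quick sketch of that arithmetic (Bayes' rule, using the quoted sensitivity and specificity and the 6% prevalence figure):

    # Share of positive results that are false positives (Bayes' rule),
    # using the sensitivity/specificity quoted above and a 6% prevalence.
    def false_positive_share(sens, spec, prev):
        true_pos  = sens * prev
        false_pos = (1 - spec) * (1 - prev)
        return false_pos / (true_pos + false_pos)

    print(false_positive_share(0.8866, 0.9063, 0.06))   # ~0.62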

Credit to this post:

https://old.reddit.com/r/COVID19/comments/g7f373/second_roun...

However, it should be noted that the article in question for this submission does not mention the type of test used.


I wonder what's the process through which false positives happen in this case. Previous infection by milder Coronaviruses?

Edit: I'm looking at the reddit post, but I have a lot of reservations about the "prevalence 0.06", unless we're going to use the test on absolutely everybody and not only on people who are suspected cases. Has that calculator been validated as well?

If the test gave 12 false positives out of 128 negatives, how can they claim the false positive rate is 60%?


Apologies for the way this was linked to. The 6% is from this study:

https://www.miamidade.gov/releases/2020-04-24-sample-testing...

"Our data from this week and last tell a very similar story. In both weeks, 6% of participants tested positive for COVID-19 antibodies, which equates to 165,000 Miami-Dade County residents"

That is what the commentator is referring to in the linked post.

So if you plug their own figures into the calculator:

Sensitivity: 0.8866, specificity: 0.9063,

and a prevalence of 0.06 based on the study, you get the 62% false positive rate.

As the prevalence increases, as with the NYC study which found the positive rate (prevalence) to be 21%, the false positive rate decreases: down to 28% for the NYC study.
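
Same calculation as above, just with the higher prevalence:

    # Same Bayes calculation with the NYC prevalence of 21%.
    sens, spec, prev = 0.8866, 0.9063, 0.21
    false_pos = (1 - spec) * (1 - prev)
    true_pos  = sens * prev
    print(false_pos / (false_pos + true_pos))   # ~0.28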


The keyword you need to Google for why it happens is "antibody cross reactivity." Not necessarily other coronaviruses, but I imagine they're disproportionately more likely to cause it.


This is from ARCPoint Labs, where I took my antibody test:

The Antibody test is a serology test which measures the amount of antibodies or proteins present in the blood when the body is responding to a specific infection. This test hasn’t been reviewed by the FDA. Negative results don’t rule out SARS-CoV-2 infection, particularly in those who have been in contact with the virus. Follow-up testing with a molecular diagnostic lab should be considered to rule out infection in these individuals. Results from antibody testing shouldn’t be used as the sole basis to diagnose or exclude SARS-CoV-2 infection. Positive results may be due to past or present infection with non-SARS-CoV-2 coronavirus strains, such as coronavirus HKU1, NL63, OC43, or 229E.


Yes, that's one possible explanation. Interestingly, quite a lot of people might be somewhat immune to the new coronavirus due to antibodies from previous coronavirus cold infections. More than 30% showed such antibodies in a recent study. https://www.finanzen.net/nachricht/aktien/drosten-hinweis-au... (Sorry, that's the only source I have ready right now.)


60% is the probability that a particular positive test result is actually a false positive. It's not the overall false positive rate.


Not sure why, but this gave me ASMR tingles...


it had sound?

