JoeCoder_'s comments

I first read this as "a soldered-on GPU" :O


To achieve ASIC resistance, why not switch between a large pool of different algorithms sequentially, with their order and various parameters determined by the hash of the previous block?
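A toy sketch of what I mean, using stand-in hash functions from Python's hashlib (a real coin would use memory- or compute-hard functions, and the pool here is purely hypothetical):

```python
import hashlib

ALGO_POOL = ["sha256", "sha512", "blake2b", "sha3_256"]  # hypothetical pool

def schedule_from_prev_hash(prev_block_hash: bytes, rounds: int = 4):
    """Deterministically derive an algorithm order (plus a parameter
    per round) from the hash of the previous block."""
    seed = hashlib.sha256(prev_block_hash).digest()
    picks = []
    for i in range(rounds):
        algo = ALGO_POOL[seed[i] % len(ALGO_POOL)]
        param = seed[i + rounds]  # e.g. a per-round tweak
        picks.append((algo, param))
    return picks

def chained_pow(prev_block_hash: bytes, header: bytes) -> bytes:
    """Apply the scheduled algorithms sequentially to the block header."""
    data = header
    for algo, param in schedule_from_prev_hash(prev_block_hash):
        h = hashlib.new(algo)
        h.update(data + bytes([param]))
        data = h.digest()
    return data
```

Every miner computes the same schedule from the chain itself, but an ASIC would have to implement every algorithm in the pool to mine every block.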


As TFA points out, no matter how ASIC-resistant you try to make your PoW algorithm, you're just delaying the inevitable. If any of these cryptocurrencies actually manages to become the currency of the future, the incentive to increase the hash rate per kW by even a few percent will be more than worth it.

Having a pool of different algorithms might raise the difficulty, but it will also make it tricky to audit all the algorithms for vulnerabilities. On top of that, an ASIC might not have to implement all of them: it could idle and only spring into action when an algorithm it implements is selected, for instance.

The main point of the article stands: no matter how complicated you make your algorithm, a special-purpose solution will always be more efficient than a general-purpose one. It's just a matter of balancing the cost of developing the ASIC against the expected return.

If, as the article points out, developing chips for minor cryptocurrencies using ASIC-resistant PoW is cost-effective, I can't imagine how anybody could hope to design an ASIC-resistant cryptocurrency intended to become the currency of the future. Think about it: in the unlikely scenario where a PoW cryptocoin eventually replaces the dollar, the mining rewards will quickly amount to billions of dollars. The incentive to get an edge, no matter how small, would be tremendous.

An alternative scheme would be to change the PoW algorithm regularly, as the article says Monero is doing, but then you give a huge amount of power to the people selecting the next PoW. Consider how incredibly tempting it would be to develop an ASIC for some PoW algorithm and then have it selected by Monero: you'd have a huge head start.


You could make one asic that supports all algorithms, and re-uses parts that overlap, or you could even create one asic per algorithm. This substantially increases the amount of design effort required, which puts it rather out of reach for smaller companies. But Bitmain could easily pull together a design effort of that scale, so they are even further advantaged there.


Wouldn't an ASIC capable of executing hundreds of types of algorithms more or less just be a GPU anyway? At that point why not just use GPUs?


Based on the article, it wouldn’t be difficult to design an ASIC that handles multiple algorithms and accepts parameters for each.

For mining chip manufacturers, the gamble of choosing a few algorithms to support can have favorable odds, especially if constraints are promised at the start of the PoW launch. There are only so many PoW algorithms to choose from.


Serious question: why is ASIC resistance a good or desirable quality for cryptocurrencies to strive for?


The entire point of cryptocurrencies is to be decentralized, that's the key innovation. As long as I trust the overall system, I need not trust any one individual. If you're using a cryptocurrency that's centrally controlled you might as well use a fiat currency, at least those are in theory regulated.

When it comes to ASICs, it's very difficult for you or me to buy them outside the US or China. Someone wanting to set up a mining farm with 500 of them can probably arrange something, but if I want to buy just one here in Iceland, I'm screwed. No one ships single ASICs to Iceland for a sane shipping fee. Most won't ship them outside the US at all.

As a result, almost all the hashpower in bitcoin is coming from the US or China, and from the rest of the world's perspective that's not good. It is /definitely/ within the Chinese government's power to seize all of Bitmain's strategic reserve of chips and use them to perform a 51% attack on bitcoin.

The benefit of ASIC resistance is that it forces everyone down to graphics cards or at least FPGA boards, and both of those can be bought over the counter in any western country, so it spreads the hashing out geographically, which is desirable from a resilience standpoint. If ASICs were widely available it would be a different matter, but since they aren't, it's overall better to force them out of the game for now.


One reason to stop it I guess is to prevent a small group of miners from carrying out a 51% attack.


Doesn't it achieve the opposite effect? All smaller ASIC-resistant currencies can be bullied by someone with a lot of (rented) PCs. Even ignoring a 51% attack, sudden changes in difficulty and time between blocks would introduce chaos.


ASICs are specialized equipment that usually costs a lot. That leads to centralization, which is not something you want for a cryptocoin. Just look at Bitcoin, with four people controlling a majority of its hashpower.


Resisting single-chip ASICs through large (on the order of a GB) memory requirements may be desirable because it promotes the use of commodity memory chips. The memory would then account for most of the power consumption and hardware cost, leaving the ASIC that ties the memory chips together much simpler and much cooler-running.
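A minimal sketch of the memory-hard idea (scrypt-style, with deliberately tiny toy parameters; real coins use buffers on the order of a GB): first fill a buffer sequentially from the input, then do data-dependent reads so the whole buffer has to stay resident.

```python
import hashlib

def memory_hard_hash(data: bytes, buf_slots: int = 1 << 10) -> bytes:
    """Toy scrypt-style hash: illustrative only, not a real PoW."""
    # Phase 1: sequentially fill the buffer from the input.
    h = hashlib.sha256(data).digest()
    buf = []
    for _ in range(buf_slots):
        h = hashlib.sha256(h).digest()
        buf.append(h)
    # Phase 2: data-dependent reads -- the next index depends on the
    # current state, so the whole buffer must be kept in memory.
    state = h
    for _ in range(buf_slots):
        idx = int.from_bytes(state[:4], "big") % buf_slots
        state = hashlib.sha256(state + buf[idx]).digest()
    return state
```

Because the read pattern in phase 2 is unpredictable, an ASIC can't avoid buying the same commodity DRAM everyone else does; the hashing logic around it becomes a small part of the total cost.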


The majority of miners are not much more than hobbyists. I am, too. They prefer GPU mining because they know GPUs better than ASICs and because GPUs are more flexible. The barrier to entry is lower.

Additionally, imagine you bought GPUs. Then of course you are going to oppose changes that destroy the profitability of your GPUs. If you then read someone writing about ASICs fostering centralisation, you'll be inclined to approve of such statements.

Ironically, however, it has been shown that GPU mining is quite possibly more prone to centralisation attacks than ASIC mining.


Why is proof of work a desirable quality? I believe the answers to both questions are the same.

If something designed to be hard can be made easy, you lose whatever goal you wanted to achieve by the former.


You just described x16r, aka Ravencoin, developed by Overstock.com for ASIC resistance.


There is Verge, which does this. https://vergecurrency.com/faq/mining/


Actually, Verge runs the algorithms in order, so it's more predictable. x16r, aka Ravencoin, is what the poster was describing.


How long until I can use this as a Photoshop plugin?


There's been content aware fill for a long time: https://youtu.be/Ge9jsJZ3lA0?t=245 It's not the same backing tech but for practical purposes it's as good.


Current inpainting algorithms are good for removing wires or adding textures. The new NVIDIA technique should be able to synthesize new information that isn't in the original image.


I doubt Adobe's content aware fill could replace someone's eye or a few of the other examples in the video.


About 10 years ago, at least, a GIMP script doing a version of this was put out: Smart Remove / Heal Selection -- https://www.youtube.com/watch?v=3h1gZJsjKxs -- I recall Adobe demoing similar stuff at various points, though I suppose this is the first time someone's used a generative neural net to fill things in rather than just nearby pixels.


Years


I'm hoping they can use some of this cash to make a better desktop client:

1. That can be minimized to the system tray.

2. That can be used when behind an http proxy server.

3. Doesn't require a phone to use.

4. Doesn't take 200MB ram to run.


You seem pretty passionate - wanna help us test proxy and system tray support?

Tray support behind command-line argument: https://github.com/signalapp/Signal-Desktop/pull/1676

Proxy support behind command-line argument: https://github.com/signalapp/Signal-Desktop/pull/1878


So, what are you using the 15,800 MB of RAM for?


Running Slack. Gosh


Running 4 instances of IDEA, 2 Android emulators, gradle to build an app, and a browser?


100 tabs on Chrome?


Without tree style tab?


And turn crappy music into good music.


No option to minimize to the system tray? So I have to have the Signal app taking up room in my taskbar all the time?


> There's no good technical reason not to enable lossless compression

Why would you want to losslessly compress audio or video? The resulting files would be huge--often too huge to stream. Lossy compression is what you want.

I must be misunderstanding the article?


My guess is they mean "visually lossless" i.e. "lossy" compression but with high quality settings so the user won't notice any difference.

That's a slightly odd way to phrase it, and I'm amazed that it's not a basic part of any streaming service these days, but makes more sense to me than the other suggestions.


I interpret that as meaning the difference between using stream copy and re-encoding videos when converting between different formats.

For instance, the user might upload an H.264 video as an MP4 file. Streaming services generally won't serve the MP4 file itself, but will create an HTTP Live Streaming (HLS) version where the video is split into multiple .ts chunks along with a .m3u8 playlist.

However, such services (including YouTube) will generally re-encode the video when doing so, and the end result will have poorer quality than the original uploaded file.

This is actually unnecessary because .ts files support the H.264 codec, so the video does not need to be re-encoded. Instead the streams can be copied to retain their original quality.

If additional compression is necessary, I suppose it would be possible to apply HTTP compression using algorithms such as gzip or deflate.

Of course, this is purely speculation. It would be nice if Cloudflare clarified what they meant. But it would certainly be great if they used stream copying where possible instead of re-encoding.
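As a sketch of the stream-copy approach, ffmpeg can remux an H.264 MP4 into HLS segments without re-encoding (filenames here are placeholders):

```shell
# Remux to HLS without re-encoding: -c copy passes the existing
# H.264/AAC streams straight into .ts segments, preserving the
# original quality instead of generating a new lossy encode.
ffmpeg -i input.mp4 -c copy \
  -hls_time 10 -hls_playlist_type vod \
  -hls_segment_filename 'seg_%03d.ts' playlist.m3u8
```

Re-encoding would only be needed for renditions at other resolutions or bitrates; the top-quality rendition could always be a straight copy like this.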


If you read on for context, it sounds like they are comparing to the case where the customer simply sends some uncompressed video file, and pointing out that you might as well enable lossless compression in this case. However, CDNs (or somebody) introduce technical obstacles for this because they charge by bandwidth, so compression would hurt their profits.


> uncompressed video file

1920x1080 at 3 bytes per pixel and 30 frames per second is 11GB for just one minute of video, and 186MB per second. That's far too big and expensive to stream, and outside of somewhat special circumstances I've never heard of anyone storing video in such a huge format.
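The arithmetic, as a quick sanity check (assuming 24-bit RGB with no chroma subsampling):

```python
width, height = 1920, 1080
bytes_per_pixel = 3          # 24-bit RGB, uncompressed
fps = 30

bytes_per_second = width * height * bytes_per_pixel * fps
bytes_per_minute = bytes_per_second * 60

print(bytes_per_second)   # 186624000  -> ~186 MB/s
print(bytes_per_minute)   # 11197440000 -> ~11.2 GB/min
```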


Agreed. We don't have the whole story here, I'm just trying to understand from context. Perhaps the thing that's being sent is not a raw video file, but some other file that could still be improved via lossless compression.


I would assume because the CDN customers make downstream promises to their own customers that the data will maintain losslessness. There are two potential reasons why that guarantee might be useful that I can think of:

- your product is explicitly lossless as a feature (like Tidal for streaming lossless music)

- you have to provide the files to others for reasons other than watching or listening (like editing), where integrity matters, and the use case/context requires a stream instead of just a file download


I don't think they're saying lossy isn't useful, but that too many people aren't doing _any_ compression at all. That's the way I read it; I could be wrong.


When a private company has a massive failure, customers have the freedom to go elsewhere and the company may go out of business.

When a government agency has a massive failure, we're sort of stuck with it, short of hoping politicians might do something about it.


That's sorta the point – in this case, I have no ability (as a consumer) to "go elsewhere." Aside from the two times I've placed freezes on my credit history, I have no relationship with Equifax, and have never consented in any way to what they collect and share about me.

This contrasts pretty heavily with, say, Facebook – where even though they collect and share plenty of information about me, I'm at least both consenting and continuing to feed them by using a service I presumably find worth the trade-off.

If you are the type of person who gets upset about this dynamic of Facebook, then Equifax should be completely next-level.


True, but at least third parties (where they check Equifax, etc.) have a choice. With government, no one has a choice.

I do agree that handling of data could be regulated in some way, though, much like how we have agencies for keeping tabs on companies that handle other things (hazardous materials, food, etc.).


You do, too, have a choice with government: you leave. Unless you live in North Korea or a similarly dictatorial state, you have a choice to leave your country.


> and have never consented in any way to what they collect and share about me.

Don't you consent to credit reporting when you sign up for credit, though?

You can "choose" not to use Equifax by not getting credit from people who report to Equifax.


> Don't you consent to credit reporting when you sign up for credit, though?

No, I consent to allow creditors to do their due diligence to see if I'm a viable candidate for their product. Equifax is not required in this transaction, it's just beneficial for the creditor, their customer, to use them to streamline that information collecting process. Loans and credit existed prior to credit reporting agencies and will exist long after.


> No, I consent to allow creditors to do their due diligence to see if I'm a viable candidate for their product.

You do much more than that, though. From my card agreement:

"We may obtain and review your credit history from credit reporting agencies and others. We may, from time to time, obtain employment and income data from third parties to assist us in the ongoing administration of your account. We may also provide information about you and your account to credit reporting agencies and others. We may provide information to credit reporting agencies about this account in the name of an authorized user. If you think we provided incorrect information, write to us and we will investigate."

> Equifax is not required in this transaction

That's up to the business to decide, not you. You can choose not to get credit from them if you object to how they run their business.


>That's up to the business to decide, not you. You can choose not to get credit from them if you object to how they run their business.

Regardless of what the business decides, no, Equifax is not required to assess my credit. If the business decides to go to McDonald's to get lunch while they are doing this process, that's not required regardless of what the business decides. It may be nice for them, it may make their process easier, but it's not /required/.


I agree that it's not required.


That is not a pragmatic solution. An American adult pretty invariably has to interact with the credit system as part of life.


Yeah, I guess the pressure should be on the lenders who use Equifax, not directly on Equifax.


Most of the language in those agreements just says "major bureaus," though, not explicitly which ones. And major transactions will usually pull all three. Trying to pick only lenders that don't use Equifax (or another bureau) may not even be possible for something like a car or home loan.


How would that possibly work? There are (I think) three bureaus, and creditors tend to share info back to all of them.


This is true, but it omits the extremely relevant third case.

When a private company has a massive failure that impacts someone other than its customers, then the impacted parties have no recourse whatsoever.

You're not an Equifax customer. Banks and big businesses are, and they don't really have a strong incentive to punish this sort of behavior.


> When a private company has a massive failure that impacts someone other than its customers, then the impacted parties have no recourse whatsoever.

IANAL. This is absolutely not true. You don't have to have a direct relationship with the other party in a negligence case. An easy example would be that you can sue the manufacturer of your transmission even though your Civic was sold to you by Honda. Equifax would have to somehow argue that they owe no duty of care to the people with private information in their database which just isn't going to happen.


Yes, technically this can be addressed by a class action suit. Which is better than nothing, I guess.

This has always struck me as an odd anti-regulatory argument, though. "Governments shouldn't use their power to distort the market. Therefore regulations are inherently bad. Instead we should distort the market using a different branch of the government, but in a much more capricious and unpredictable manner."


Standing and damages must be shown.


The author's point is still valid with that argument, though -- the people with the greatest risk from participation with credit bureaus aren't their customers and don't have a right to decline participation. It's a lopsided marketplace where the market incentives aren't aligned with the risk.


Your second premise assumes that citizens are powerless to effect change, which is sadly more true these days, but in a proper functioning republic, government agencies are accountable and fixable.


In a proper functioning free market, companies are replaceable and reasonably transparent, so people can arrive at decisions and 'vote' with their actions in a practical sense.

I think the key point here isn't the system, but 'proper functioning'. In the event that no system is ever properly functioning, what's causing the least damage?

I'm reminded of the many stories I've seen claiming that people given experience of both preferred 'bread line' communism to free market capitalism.

These people were never the lottery winners of capitalism: they're always just worker bee types, and their observation was that communism sucked but was stable and consistent. Small dreams, small risks, reasonable safety for the worker bees.

When they experienced capitalism, they had no preparation to competitively attack their fellow workers and win, and as a result they didn't win, they became worker bees under capitalism, and dropped below their previous living standard. Granted, this provided an environment where more ambitious worker bees could prevail and get rich, and that's the system working as intended, but for the less motivated ones, their experience told 'em communism took better care of them… in spite of the litany of horror stories we've all heard over and over (I live in the USA, so I've been indoctrinated against communism all my adult life).

I think it's pretty important to examine how these systems work when they're NOT properly functioning. It seems like free market capitalism produces some pretty spectacular damage when it catastrophically fails, but it's more abstracted. When something like communism catastrophically fails, it's in the form of state genocide, with more intentionality.


It sounds like the abstraction of capitalistic damage is a feature


Sadly, being the commodity and not the customer, even though it directly affects our lives, and in spite of it being a private company, we can't do anything about it. And is the hope then that Equifax will go out of business? Are we under the assumption that either of the other two are better, or are they just the same, and they haven't had their day in the spotlight yet?


Did you even read the article? The whole point is that as a consumer you have no choice to go elsewhere. Equifax has data about you that you never knew about and never consented to. They also sell their data to all sorts of agencies, and their business practices are really shady. If you need to check your credit report multiple times in a year, they will sign you up for some useless monitoring program and make it really difficult to cancel. At the very least, credit bureaus, who have enormous power over individuals' lives, without their knowledge, need to be made accountable and their profit motives squashed.


None of those apply to the CRAs. We aren't their customers, yet all of our data is held by them.


Why not develop tox instead, which is open source, end to end encrypted, on more platforms, and seemingly further along in general?


Maybe because tox is developed by people who don't know what they're doing.

Money quote from a tox dev: "Tox provides some strong security guarantees. We haven't got to the point where we can enumerate them properly, given the general lack of understanding of the code and specification."

https://github.com/TokTok/c-toxcore/issues/426


Briar is also e2e and open source. It also has a ton of mesh networking features that Tox doesn't have.


Can you say a bit more about the mesh network features of Briar?

I've looked through a lot of the documentation and can't find anything other than references to bluetooth and wi-fi, which is opaque to me.

I'm kind of wondering about something like briar, but that can connect over a cjdns network if available... is that what it's doing?


I once tried to read their "protocol documentation" and realised that it was effectively non-existent, and the only way to understand what was going on was to read the toxcore code, which was written by 4chan.

I'm not a crypto expert, but I also personally wouldn't put much stock in the security of their protocol or implementation.


I have the impression Tox withered and died, which is really sad considering how usable (when compared to alternatives) it is.


FWIW, the git repository looks like it was worked on within the last couple of days.


Why not have a "this page would like to perform work in the background [allow|block]" notification?


I'd guess it's because most users would have no idea what to choose or what the effect of each choice would be, and if you add more text to explain it, most users won't read it. Pop up the notification too often and it becomes a nuisance.

Better IMO to bury the choice deep in settings or something, for power users, if you even offer the choice at all.


> I'd guess it's because most users would have no idea what to choose

Sounds like you could kill two birds with one stone: allow a non-lame web and encourage users to learn what that means. We can't complain about tech illiterate users with one corner of the mouth and then make them decide what goes with the other.

> if you add more text to explain it, most users will not read it

They're not users. They're the blight on the back of users. Why cater to them? Is someone who looks left and right before crossing the street a "power pedestrian", too?

"This tab wants to use a lot of [RAM/CPU/GPU] while it is in the background, is that cool with you?

If you don't know what this means, it's best to select no. Fuck that site anyway for using a lot of resources without making it obvious that it's doing something heavy, or asking first, or explaining what is going on. If they don't like users clicking no or even navigating away from the page, that's their problem. If you read this far, click here to claim your Avid Reader Achievement Trophy."

Not bad for a first draft?


> They're not users. They're the blight on the back of users. Why cater to them?

Actually, they are about 90%+ of users:

https://www.nngroup.com/articles/computer-skill-levels/


Fine, and most "webmasters" are really just hack frauds. On my planet, it's their job to get a room and ours to burn it.

edit: Usage implies intentionality. Intentionality implies awareness. Not being some super guru, but more awareness than "is text, won't read". Therefore, while the drive to shovel hardware on people, to create needs you then get to fill, did indeed bring all sorts of people who now get called users to the table, I don't consider them such. It's like someone who has a seizure on dance floor is not dancing, or like someone you forced to be somewhere is not "visiting that place". And I know people don't find it rude to just ignore a claim of how something ought to be with "but this is how it is, which is why it can and must be that way", which is why I say what I say. If someone does bad things and won't discuss them, it's absolutely up to the people who are aware and care to act. If you don't understand poetic language unless it's flowery that's your problem. So yes, I stand by what I said, just not by what it might evoke in the minds of some readers.


This sounds a lot like the arguments in the 80s about how GUIs were bad because they let anyone use a computer. Empathy makes it easier to write good software.


Without special APIs to request it, this would be as ubiquitous as cookie warnings, making it highly annoying (and thus a bad business decision for chrome). And at least for the case presented in the article proper APIs without this throttling already exist.

