
Well, potato is one of the higher-calorie crops, and one pound contains around 350 calories. If you produce 3,120 lbs a year, that gives you about a million calories.

Now, let's assume a family of three - an average person needs around 2,000 kcal a day. That's 2,000 * 365 * 3, or around 2,200,000 kcal a year. So you come up quite a bit short. And that's in a good year; you're gonna have bad years, too.
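As a quick sanity check, here's that arithmetic in Python (both the 350 kcal/lb and the 2,000 kcal/day figures are the rough assumptions above):

  KCAL_PER_LB_POTATO = 350        # rough figure from above
  KCAL_PER_PERSON_PER_DAY = 2000  # rough figure from above

  produced = 3120 * KCAL_PER_LB_POTATO        # ~1,090,000 kcal per year
  needed = KCAL_PER_PERSON_PER_DAY * 365 * 3  # ~2,190,000 kcal per year for three people

  print(produced, needed, round(produced / needed, 2))  # -> 1092000 2190000 0.5

So the stated yield covers only about half of what the family needs.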

It's also a function of climate and soil. In the 19th century, settlers on the plains - Nebraska, Wyoming, etc. - often couldn't make it work on the 640 acres granted by the government. In contrast, there are eastern states where 20 acres would be more than enough.

(Farming in the West is now much more viable thanks to deep wells and mechanical irrigation, but that's a capital-intensive and resource-intensive approach that works best at scale.)


As pointed out downthread, you can indeed feed a family on one acre of land, and many people do actually do this.

The problem with your math is that it assumes the 3k lb yield from the gp comment is for potatoes. Your crop yield depends a lot on the crop. Apparently you can get between 10 and 30 tons of potatoes per acre (that range runs from beginner to expert yields), which would be 7-21 million calories per year. Plenty of room, then, to grow a number of other crops to eat a balanced diet.


Potatoes (if you eat the skin) and milk would theoretically be a balanced diet; supplemented with fish and occasional meat, it was pretty much the Irish diet pre-potato famine.

Boring as hell after a while but it'd keep you alive.


During the famine, Ireland produced way more food/potatoes than it required to feed its people; they were just taxed to all hell.

Currently the US produces about twice the calories it requires. There are so many calories produced in the form of corn that the industry has made huge efforts to find new ways to use those calories (HFCS, ethanol, etc.) in order to justify corn industry practices. The one-liner is that we need to be able to feed the people, but obesity is at an all-time high. People need more nutrition, not more calories.


Taxes were not the main contributory factor. "Ireland continued to export large quantities of food, primarily to Great Britain, during the blight. In cases such as livestock and butter, research suggests that exports may have actually increased during the Potato Famine. In 1847 alone, records indicate that commodities such as peas, beans, rabbits, fish and honey continued to be exported from Ireland, even as the Great Hunger ravaged the countryside." https://www.history.com/topics/immigration/irish-potato-fami...


Thanks for the clarification. What I actually meant was that they were being "taxed" in terms of food sent to the rest of the UK, not in terms of money. Even with the article you linked, it's not clear to me whether "exports" means they were forced/taxed into sending food, or whether they were willingly exporting food for money in lieu of eating... or something else.


Most Irish didn’t own property and were tenants. They didn’t “export” in lieu of eating; the English landlord exported the fruits of their labor and left them to starve. The English were very concerned about the Irish and their moral fiber, so they allowed them to persevere rather than get hooked on charity.

Others couldn’t pay their rent and were evicted. Millions of Irish didn’t flee to the slums of NYC, etc for kicks.


For this reason the Great Famine is sometimes characterized as a genocide perpetrated by the British landlord class.


> they were willingly exporting food for money in lieu of eating...

The "they" who decided to export were not the "they" who starved.


Ireland could not affordably import food, due to the Corn Laws.


You can also get pretty complete with potatoes and oatmeal.


You can maybe keep a small family alive on an acre, but not really feed them long term. With so little space, you need to dedicate nearly the entire plot to high-calorie crops such as corn.


Each year, I grow an increasing amount of my food on less than an acre. Corn is not an efficient way to get calories in an organic system; it requires too much nitrogen. As others have mentioned, potatoes are much better. And beans/peas, which meet their nitrogen requirements via bacterial symbiosis. If you really want to go efficient, grow Azolla. I once calculated that a person in a temperate climate could produce most of their nutritional requirements with a plot of roughly 25 square meters. Azolla doubles in mass every 2-3 days in ideal conditions and draws everything but trace minerals from the atmosphere.


Wikipedia seems to think that Azolla might be neurotoxic.


Yes and no. When stressed, it produces a neurotoxin.


Your math is wrong. Way wrong. 70 lbs per week is 10 pounds per day. I'm not sure it's feasible for a person to eat that much. Maybe a family could. Maybe.


It's frequently claimed that the average Irish male just prior to the potato famine ate well over 10 lbs of potatoes per day:

> On a typical day in 1844, the average adult Irishman ate about 13 pounds of potatoes. At five potatoes to the pound, that’s 65 potatoes a day. The average for all men, women, and children was a more modest 9 pounds, or 45 potatoes.

https://slate.com/culture/2001/03/putting-all-your-potatoes-...

While there are people who doubt these figures, eating 10 lbs of potatoes per day is definitely more plausible than your comment indicates.


I have lived almost exclusively on potatoes at several points in my life, and there's just no way. I've watched a bulky manual laborer live almost exclusively on potatoes as well, and still not near 10 lbs a day.


This is kind of inconceivable to me. Is it because I’ve never done enough manual labor to eat 65 potatoes in a day? I can’t imagine even finding the time to cook and eat them.


Arguing against the article, 1/5 lb is a pretty small potato. I just weighed a baseball-sized potato I dug last week, and it was 7.5 oz (~1/2 lb, 200 g). So if you are picturing an average baked potato, it's probably only 30 potatoes per day. Which, granted, is still a lot.

Cooking time doesn't strike me as a problem. Boiling 30 potatoes does take somewhat longer than boiling 1 potato because the mass is greater, but it's pretty much boil and forget. Also, the boiling can probably be done by one of those women or children who are only eating 10 potatoes a day!


Probably because they weren't getting enough protein. It's easy to eat carbs forever if you have nothing else because you'll still feel intensely hungry without protein.


And if the calorie count upthread is right, that’s 4550 calories per day purely from potato, not counting any added butter, milk, beans, meat etc! I know people did more manual labour back then, but something seems wrong with that figure.


They may not have had access to many of the (barely) higher-end foods. I read somewhere that the English, who had conquered them, took a lot of that away to sell / use in England, including beef.

Update: Found where I read it - Wikipedia:

[ The Celtic grazing lands of ... Ireland had been used to pasture cows for centuries. The British colonised ... the Irish, transforming much of their countryside into an extended grazing land to raise cattle for a hungry consumer market at home ... The British taste for beef had a devastating impact on the impoverished and disenfranchised people of ... Ireland ... pushed off the best pasture land and forced to farm smaller plots of marginal land, the Irish turned to the potato, a crop that could be grown abundantly in less favorable soil. Eventually, cows took over much of Ireland, leaving the native population virtually dependent on the potato for survival.[41] ]

That quote is from the "Potato dependency" section of: https://en.m.wikipedia.org/wiki/Great_Famine_(Ireland)


Pre-industrial agricultural labour is incredibly calorie-intensive. 4,000-6,000 calories sounds about right for a day of agricultural work.


I used to eat about that much a day when I was running twice a day (and I was losing weight while doing so). Michael Phelps was known for eating 10,000 calories a day.


His math is correct. Also, it's really easy math to verify. It's not surprising either. With the stated yields, you could feed a person. You'd need more for a family.

Of course, the yields could probably be improved in order to get the 140 or so lbs needed to feed a family of 3.


The math is correct, but the calorie count per pound of potato is way off. It is 350 kcal, not 350 cal.


What the US calls 1 Calorie is 1 kcal elsewhere.

We refer to kilocalories with capital-C "Calorie". I don't know why.


Probably because lower-c calories are meaninglessly small for like 99% of the population. It'd be like doing all your cooking in milligrams.


When I used to work heavy manual labor I was definitely able to easily put down around 7-8 pounds of food per day. One of my coworkers was into weight lifting and estimated we were burning between 3,000-5,000 calories per day depending on the job.

10 lbs of potatoes is only 3,500 calories. Which, while far in excess of a "normal" sedentary lifestyle, is completely reasonable for people working heavy labor jobs if that's their primary food source.


A typical person eats 3 to 5 lbs of food per day, and that generally includes some very calorie rich meats, dairy, nuts and/or processed foods - not exactly the stuff you'll get in your backyard. You might be able to feed a family on an acre of beans, but not on an acre of generic greens.


A 10 lb bag of potatoes isn't really that big. If you take one of those bags and split it into thirds, as in three meals a day, that's a bit more than 3 lbs per meal. If that's your only source of food, that doesn't seem entirely unreasonable.


Potatoes have great yield in terms of calorie per unit harvested. Other plants, not so much. Think about how much weight you might throw away with a pumpkin or a watermelon.

I have some family members who devoted about an acre and a half to a mix of crops, squashes mostly as they grow the best where they live, and for a family of 5 still had to supplement their diet pretty significantly.

At the same time, they ate more healthily than they ever had before, and often had too much of certain crops that they gave away or sold at the local farmer's market.


The Magic Valley in Idaho (named after its transformation) is a great example of this. Modern irrigation has turned what is naturally a high desert climate into a food-producing powerhouse.


If you do that, you're merely trading one grievance for another: "evil company marked my bug as duplicate to avoid paying" for "evil company claimed to have gotten duplicate reports to weasel out of paying the full amount". More people upset, although individually, maybe to a lesser extent.

The core issue is not the reward division algorithm, it's the inherent lack of visibility. One solution here would be to just open all reports after a while, but this creates problems of its own. One is that it gives ammo to people engaging in dishonest or clueless PR. Another is that some researchers don't actually want visibility, because their employers have murky rules around such engagements, or because they have some far-off disclosure timeline in mind (as a part of a presentation at a conference, or whatnot).


How about combining reward splitting with some way to show a count of recent submissions (no details, just a count), so people know before submitting that there is a chance someone else has already submitted the issue?

Or a mechanism for companies that use email to register the researchers submissions in HackerOne. The details will be sealed and non-public, with researchers having no way to know it exists unless the company provides a link to it as proof of work. HackerOne thus acts as a kind of notary against accusations from researchers that it wasn’t really a duplicate.


How about dividing the bounty among reporters, having it increase the longer the bug goes unfixed after the first report, and requiring that those paid be publicly acknowledged?

Probably also need stiff penalties for insiders who might conspire to notify others of bugs and split the payout.


Nobody is going to do anything like this. Bug fixes take time to coordinate and deploy, and nobody is going to make themselves and their schedules accountable to some random bug bounty submitter. At the point where you're doing this, you might as well just engage professional pentesters; they don't give a shit when you ship fixes --- you just pay them to find bugs and write them up.


The trouble with your first point is that companies won't go for it; no point in having HackerOne around if no one will use their platform. It's a tricky problem; let's solve it with AI and Blockchain!


Why not divide the payout? The companies paying will pay the same amount, just divided among all the reporters. They already do the work of identifying duplicate reports. Maybe it could be weighted to pay more to the first reporter.

As for not doing it at all: at some point, critical industries may have to be regulated to force them to behave responsibly.
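A minimal sketch of what that split could look like, assuming a made-up rule where the first reporter gets double weight (the names and numbers are purely illustrative):

  def split_bounty(total, reporters, first_weight=2.0):
      # Give the first reporter a larger share; everyone else splits evenly.
      weights = [first_weight] + [1.0] * (len(reporters) - 1)
      scale = total / sum(weights)
      return {name: round(w * scale, 2) for name, w in zip(reporters, weights)}

  # A hypothetical $5,000 bounty reported by three researchers:
  print(split_bounty(5000, ["first", "second", "third"]))
  # -> {'first': 2500.0, 'second': 1250.0, 'third': 1250.0}

The companies' total outlay stays the same; only the division changes.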


As a photographer, the comparison to "raw" results without color balance or noise removal seems somewhat deceptive. The effects visible in the video seem easy to quickly replicate with existing techniques, such as the "surface blur" filter that averages out pixel values in areas with similar color.

This happens at the expense of detail in low-contrast areas, producing a plastic-like appearance of human skin and hair, and making low-contrast text unintelligible, which is why it's generally not done by default.


The comparison is fair because it tries to automate expertise.

I'm sure you know exactly how much of which filter to apply for similar results. Laymen like ourselves will need a lot more trial and error. Their contribution here is to provide a push-button, automated mechanism.

I would have probably also tried something simple and given up due to the noise. So this is definitely interesting.


This is completely untrue.

What you are describing is usually called automatic tone mapping. This is basically noise reduction and possibly color normalization applied to a brightened dark image. Showing their black image as the starting point is silly, because JPEG will make a mess of the remaining information. What they should show is the raw image brightened by a straight multiplier, to show the noisy version you would get from trying to increase brightness in a trivial way.


What JPEG? They are using raw data.


The image on GitHub is a JPEG made from RAW. Since a RAW file has more dynamic range and contains a lot more information than a JPEG, you can take that photo into an editor and crank up the brightness. You will get a noisy image, but it will be a lot brighter and will probably resemble the high-ISO image in the middle. Then, in an editor, you can apply some de-noiser to get results similar to the last one.

So presumably this neural net more or less does it for you.
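Roughly, the "crank up the brightness" step looks like this with rawpy (a sketch only: the file name and the x100 multiplier are made up, and a real workflow would follow this with a denoiser):

  import numpy as np
  import rawpy

  # Hypothetical short-exposure RAW file; postprocess without auto-brightening
  # and with a linear gamma so a straight multiplier is meaningful.
  raw = rawpy.imread("dark_frame.ARW")
  rgb = raw.postprocess(use_camera_wb=True, no_auto_bright=True,
                        gamma=(1, 1), output_bps=16)

  ratio = 100  # made-up exposure ratio between desired and actual exposure
  bright = np.clip(rgb.astype(np.float32) * ratio, 0, 65535).astype(np.uint16)
  # 'bright' is the noisy-but-visible image you would then hand to a denoiser.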


The *PNG is there just to show the results produced by the CNN; if you watch the linked video, they do exactly what you are suggesting and then compare both results.


Their example on their GitHub page uses a JPEG that makes it look like they are creating something from nothing.


To me the results seem vastly superior to those sort of simple DSP algorithms. The video shows a comparison with some denoising: https://youtu.be/qWKUFK7MWvg?t=102


Your example strikes me as the kind of thing neural networks are much better at than a fixed filter. You or I could easily identify regions of an image where it's safe vs unsafe to do the surface averaging, and boundaries where we wouldn't want to mix up the averages. (For example, averaging text should be fine, so long as you don't cross the text boundaries.) A CNN should also be able to learn to do this pretty easily.


What you are describing is a class of filters known as edge preserving filters. You can look at bilateral filters and guided filters for examples that have been around for decades at this point.
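For anyone who wants to try one, here's a minimal example with OpenCV's bilateral filter (the file name and parameter values are placeholders; good settings depend heavily on the image, and a guided filter is available in the opencv-contrib ximgproc module):

  import cv2

  img = cv2.imread("noisy.png")  # placeholder input image

  # Bilateral filter: averages nearby pixels, but down-weights pixels whose
  # color differs a lot, so flat areas smooth out while edges are preserved.
  smoothed = cv2.bilateralFilter(img, d=9, sigmaColor=75, sigmaSpace=75)

  cv2.imwrite("smoothed.png", smoothed)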


So we can do a decent job with hand-designed filters... Why aren't they in use for the problem the parent describes? Are they not good enough to deal with small text boundaries?

A lot of hand-built filters (I see a lot of these in the audio space) have many hand-tuned parameters, which work well in certain circumstances and less well in others. One of the big advantages of NN systems is the ability to adapt to context more dynamically. The NN filters can generally emulate the hand-designed system and pick out weightings appropriate to the example.


This is effectively noise reduction, which bilateral and guided filters are actually used for. They take the weights of their kernels based on local pixels and statistics. You can also look up other edge preserving filters like BM3D and non-local means.

I don't know what you mean by hand-made filters, and I don't know why that's the conclusion you jumped to.


>As a photographer, the comparison to "raw" results without color balance or noise removal seems somewhat deceptive.

Huh? At 1:40 in the video that's exactly what they do.


Interestingly, this effect is notably visible in their example image [0]. Notice the distinctly "plasticized" appearance of the book cover, and how the text is not intelligible in the low-contrast areas of the reflection.

[0]: https://raw.githubusercontent.com/cchen156/Learning-to-See-i...


Note (a) and (b) are separate photographs (different angles and everything), and that (c) is based on (a), not (b); comparing the glare between (b) and (c) isn't quite an even comparison.


Oh gosh. Taking one second to think about it, _of course_ (a) and (b) are separate photographs -- that is the entire point of that diagram. Somehow my brain farted right over that when making my previous comment.

Thank you, not only for setting me straight, but also for doing so as kindly as you did.


It would indeed be interesting to see a comparison with, for instance, non-local means on the scaled raw image. The speed is superior in any case, I suspect.


I always have this complaint too. It's fundamentally a lossy process, in the hand-wavy sense. It's more "impressive" looking, but actually conveys less real detail.


One of my design goals for AFL was to make it very simple to use - because there are plenty of fuzzers that work OK when you dial in 50 knobs just right, but fail spectacularly otherwise - basically ensuring that nobody but the author can really use the tool to its full capacity.

While AFL++ is cool, it sort of ditches that philosophy, giving you a lot of options to tweak, but not necessarily a whole lot of hope that you're going to tweak them the right way. So, that's one gotcha to keep in mind.


This can't be overstated. I went from having never run a fuzzer to having a 16-node run executing for thousands of machine hours on a somewhat unfamiliar C++ codebase with minimal effort.

Both the tool and the documentation made it easy for me to jump in, identify bugs, write new test cases, implement a fix, and verify the fix passed without issue. I've mentioned it on HN before, but AFL taught me how incredibly difficult it is, even for experts in the field (think most senior engineers at a FAANG), to write C++ without security vulnerabilities. I was even able to find and fix bugs that had been previously reported but that no one was able to reproduce reliably.

If there was an AFL t-shirt, I'd wear it ;-)


(Author here)

The funniest part is that this ugly hack kept working across platforms for many years; whereas when somebody else implemented a "proper" integration with the clang / llvm API, their solution proved to be extremely fragile. The API wasn't stable between compiler versions, and because it wasn't really used much, it had all kinds of bugs, including being outright unusable at times.

Also, most distros packaged clang in a way that made it impossible to compile the plugin, because of missing or mismatched headers, missing companion tools, etc. So you had to download and rebuild the whole compiler, which took hours (and that's if you didn't get stuck in dependency hell).

So yeah, this was very much a lesson in "worse is better".


+ it allows working with GCC-based compilers (GNAT for example) and mixed C/Ada code bases.

Although the instrumentation via asm patching works well in most cases, it can break down in strange ways. See (shameless plug) : https://blog.adacore.com/running-american-fuzzy-lop-on-your-...

Amazing tool, quite extensible (I adapted it to work only in memory + tmpfs without touching disk for a specific corner case... very easily) and readable. It's also fun to scale to multiple cores and multiple machines.


One other benefit for me personally is that finding this little chunk of code really demystified a lot of what was happening. It’s been a few years but I still recall the aha tingles from the whole thing.

I suck at assembly but this proved to be a nice spot to hack away at it (with copious googling) to even further improve my understanding of not only the instrumentation but the effects that mutations had on program execution.

So for me worse was definitely better.


Disappointed that my LD_PRELOAD exploit - still unpatched after 20 years! - did not make the list:

http://lcamtuf.coredump.cx/soft/ld-expl


I'm afraid it only got worse.

  marek@mrnew:~$ unshare -Ur
  root@mrnew:~# 
Not only is it easier to do than LD_PRELOAD these days, you actually _do_ get real elevated capabilities.


But you are not really root after that; you only think that you are.


Does objective reality exist, my friend?


I mean, in this case other calls will fail with EACCES, so here, yes.

And I've done a similar thing for an integration test framework for low level daemons, so I know very well how much of a pain it is to get close to emulating "oh yeah, you're totally root" to processes via LD_PRELOAD.


You can refuse to accept an inheritance. It goes to whoever is next in line. If everybody refuses, the state gets to keep it.


In most jurisdictions, you only get to refuse an inheritance in full. You do not get to pick and choose what to refuse.


Hence GP's counterfactual, "if this piece were the only thing you inherited and you were broke".


Sad though to refuse a 500k USD house with a negative 29 million USD bird in it.


Ok, fine - let's say you're going to inherit $100k and the art.


"The Population Bomb" was a pretty fashionable scientific prediction back in the 1960s which had a significant following in the academic and policymaker circles at the time.

It extrapolated from data about population growth, farmland capacity, etc., to reach the seemingly irrefutable conclusion that there was going to be mass starvation and famine in the 1970s. It led to calls for China-style population controls, to articles about whether it's ethical to have children, etc.

What happened instead is that population growth has slowed down quite a bit without government intervention, and that we've gotten a lot more efficient at growing food.

This does not prove anything when it comes to climate change, but is an interesting anecdote.


Thanks for bringing this up! An excellent book on this is: The Bet: Paul Ehrlich, Julian Simon, and Our Gamble over Earth’s Future by Paul Sabin


I'm no expert, but "blackmailing a bomber" does not sound like a particularly solid business plan...


The former possibility is problematic, but using it to get the police to harass someone is a real possibility. Just look at "swatting" in the US.


Sounds like the "brilliant" idea that a Coen Brothers character would have...


It has to be up there with some of the all-time bad notions. Personally my mind immediately went to that scene in The Dark Knight...

https://m.youtube.com/watch?v=AUfv32dmVFw


Hey - I'm the author of that page. Your comment is a common misconception, but the animal pictured is actually a golden-mantled ground squirrel. You can tell because the stripe doesn't extend to the eye. Thank you for subscribing to squirrel facts!


I can attest to the veracity of your claims. :-) [0]

[0] https://news.ycombinator.com/item?id=12264275


Oh! You're right. Confused him with an Eastern Chipmunk.

