Hacker News | dmcdm's comments

I've been an avid backpacker for about 20 years now, and my go-to for paper maps has always been USGS 7.5 minute quadrangles. As the article mentions, they are the "gold standard" for many.

Recently I've had some fun playing around with making my own in https://inkatlas.com/ which is a great idea; I hope it stays viable for the creator.


Their online store is, sadly, paleolithic. I tried to get some for a Grand Canyon trip, and it should be noted that when they say it might be 3 weeks before the maps are printed, they mean it. And if you call to check on the order, you're likely to find that it's been thrown over the wall to the printing department and nobody who answers the phone can answer a question. And if you'd like to cancel the order, it's too late because it's been thrown over the wall.

Great maps, lousy customer service, unfortunately :-(


For my Boy Scout troop’s bigger hikes, we would download the PDFs, mark our trails with highlighters in Preview, then print and laminate them to circumvent this issue. Some of the maps are still good to use today, a decade later.

Caveat that I have no idea if this is/was legal, but there are ways to get physical copies of the maps without dealing with the bureaucracy :).


Mounting maps. When I was in the Boy Scouts we learned to mount USGS maps on cloth.

I found a blog post which describes the process.[1]

When I was in the Scouts, we would visit the USGS office in Palo Alto, CA to buy topo maps. Visit the USGS site to see if there is a location near you.[2]

[1]: http://www.mrgus.net/2017/02/maps-for-going-afield.html [2]: https://www.usgs.gov/connect/locations


Super fascinating, I'm going to try this out. Thank you for sharing!

Where we were hiking, it was unfortunately raining non-stop for a couple of days, and the laminated plastic made a big difference.

But you can always read the map under cover or something. As long as it's transported in a watertight bag, I think the cloth solution might last longer than ours did (laminated plastic forms creases).


You can buy the quads at one of the stores in the park


At the time, one could not. Nor could one buy the quads at REI in Vegas, nor in any of the stores I called in Flagstaff. I was surprised by this, believe me.


Interesting. I didn't know about Ink Atlas. Thanks.

I primarily use https://mytopo.com/.


Inkatlas looks cool, and they're from Estonia and use OSM data. The Estonian community is small but there are some very talented mappers there. Thanks for sharing, I didn't know of Inkatlas yet!


I have a friend whose wife is from the Netherlands, and she confirms this nearly has in-joke status among the Dutch - "we speak English better than the rest of Europe because we're in love with American TV and we can only get it with subtitles".


Really? I take this as evidence indicating basically the opposite - fundamental human nature, as reflected through our communications and business practices, hasn't changed significantly in thousands of years. Some folks might be courteous, giving others the benefit of the doubt in hopes they'll "catch more flies with honey" or, more likely, due to pervasive social norms and taboos. Others might issue thinly-veiled or direct threats, hoping to coerce the desired outcome. Either way, there have been shady business people and the ripped-off, the scammers and the scammed, for going on 4 millennia, at the very least, and there really is no sign the status quo is going to change. If anything, this should dispel our romances of the human condition. The form is immaterial to the underlying function of the communication. "I feel ripped off, please deliver the goods."

~1800 years later, and 2 millennia ago, the graffiti at Pompeii http://www.pompeiana.org/Resources/Ancient/Graffiti%20from%2... tells a similar (and somewhat more humorous) story but in a different context.


That link is fascinating, thanks for sharing. Two minor things that stuck out to me:

- "Kosher garum" - didn't see that coming!

- "På stora svenska online casino kan du alltid hitta casino spel från Netent, spelen har översatts till många språk inklusive svenska." I don't speak Swedish but is this randomly injected online gambling spam?!


> På stora svenska online casino kan du alltid hitta casino spel från Netent, spelen har översatts till många språk inklusive svenska." I don't speak Swedish but is this randomly injected online gambling spam?!

Yes, it looks like the page has been compromised.


> The form is immaterial to the underlying function of the communication.

The form is very much pertinent to the outcome. If I come at you with expletives, it will produce a different result, even if just in your internal state.


Not gonna lie, pretty much the only reason I clicked the article was to see a video of rats driving tiny cars.


I, too, prefer the silly video to the underlying scientific premise.


I watch the video and see the next generation of competitors who will be beating me at rocket league.


Thank you for your honesty.


I also teach computer science (part-time adjunct, I have a regular industry job too), so the quality of computer literacy in the incoming student body is on my mind pretty regularly. This has been especially troublesome when teaching operating systems, because the OS is the deepest layer in the pile of abstractions students have to peel back over the course of their CS education. The pile of abstractions is only getting higher - the day is not far off when we have to peel back the abstraction of "persistent data generally" in terms of files before we can get on to peeling back the abstraction of files themselves. Or who knows, maybe we'll all abandon the desktop analogy of computing altogether, with its hierarchies of files and links, to move on to some new purely-graph-theoretical database notion of persistent data. Somebody will still have to talk to the disk controller, however, and I can only imagine how distant that reality will be (and already is) from most students' day-to-day experience with computing.

At the same time, I can see a great equalizing force in all this. Computers have reduced the technical acumen demanded from their users to the point that owning a computer, using a computer, and even being a "computer enthusiast" doesn't put you much ahead of anybody else, on average, when it comes to starting out toward becoming a computer scientist. I think this, combined with the fact that everybody wants to be a software engineer these days, will eventually put us somewhere around the 1970s in relative terms with regard to technical literacy in the cohort of incoming computer science students. This might sound nightmarish to the current establishment, but it has at least one positive side-effect: computer science is getting more accessible, because teachers can no longer assume that pupils come from a background of "quasi-technical" computer literacy (again, this is because conventional computer literacy has become decreasingly technical in nature).

I've heard one of the general causes behind the lack of diversity in gender and economic-background in tech workers is, at some point around the mid 1980's, CS instructors started asking of their students tasks like: "Open your editor and type..." and someone in the classroom would raise their hand and ask "Ah, um... what's an `editor'? And by the way, I don't own a computer either" and the instructor's reaction would be to privately advise that student to seek a different major, at best, or open derision at worst.

So I think we're getting away from that, which is at least a way to look at the bright side. It does make the teaching job a bit more challenging.

On a more personal gripe, a minor irritation of late is the number of students who want to do their OS homework in the Windows Subsystem for Linux, instead of even setting up a basic VM.


> On a more personal gripe, a minor irritation of late is the number of students who want to do their OS homework in the Windows Subsystem for Linux, instead of even setting up a basic VM.

That's easy to solve, and can give the students a history lesson.

Give them some tasks that work with ~/con :)

Or you can have them work with files all named the same but differing only in case. Windows-based machines get "confused".


A much-repeated myth regarding the Gros Michel is that artificial banana flavoring tastes so "strong" (to the point of being "obviously fake" to modern consumers) because it was based on the flavor of the Gros Michel, vs the much milder Cavendish they are accustomed to. This is apparently just a myth, but regardless, I guess some say the Cavendish tastes like artificial banana.

http://hoaxes.org/weblog/comments/was_artificial_banana_flav...


There was a 2012 documentary on the subject of dying alone with no next of kin, titled "A Certain Kind of Death": https://youtu.be/ErooOhzE268


I found it an amusing exercise, if not terribly relevant, even as someone who spends 90% of his dev time in C.

What rubs me about these sorts of articles is they make some presumption about the importance and necessity of writing truly portable C, as if the "C Standard" were in and of itself a terribly useful tool. This is in contrast to where I live most of the time, which is "GCC as an assembler macro language" (for a popular exposition on this subject see https://raphlinus.github.io/programming/rust/2018/08/17/unde...). And yeah, reading through the problem set I was critiquing it in the context of my shop's standards, where we might be packing and padding, using cacheline alignment, static assertions about sizeof things, specific integer types, etc. So these sorts of articles just come off as a little pedantic to folks like me. I don't doubt they're useful for some folks, and I guess it's interesting to come up from the depths of non-standard GNU extensions and march= flags to see what I take for granted.


Linus Torvalds' opinion of standards, linked in that article, is very much worth reading, but I'll link it again here: https://lkml.org/lkml/2018/6/5/769

"So standards are not some kind of holy book that has to be revered. Standards too need to be questioned."

The way I see it, a lot of compiler writers are basically taking the standard as gospel and ignoring everything else "because the standard doesn't say we can't" --- and that's a huge problem, because behaviour that the standard doesn't define often has a far more common-sense meaning that programmers expect. IMHO the onus should really be on the authors of compilers to find that reasonable meaning. In fact, the standard even suggests that one possible undefined behaviour is something like "behave in a manner characteristic of the environment" (can't remember nor be bothered looking up the standard.)


This is a common misconception. Compiler authors don't exploit undefined behavior to make themselves seem smart, or because they like breaking code. They exploit undefined behavior because somebody filed a bug saying some code was slow, and exploiting UB was the simplest way--or, in many cases, the only way--to fix the performance problem.

GCC and Clang do give you the option to avoid optimizations based on undefined behavior: compile at -O0. We think of the low-level nature of C as being good for optimization, but in many cases the C language as people expect it to work is at odds with fast code.

It's fascinating to actually dive into the specific instances of undefined behavior exploitation that get the most complaints. In each such case, there is virtually always a good reason for it. For example, treating signed overflow of integers as UB is important to avoid polluting perfectly ordinary loops with movsx instructions everywhere on x86-64. It's easy to see why compiler developers added these optimizations: someone filed a bug saying "hey, why is my loop full of movsx", and the developers fixed the problem.

Edit: Should be movsx instead of movzx, sorry.


Could you go into a little bit more detail regarding the movzx? Aren't 32-bit registers always zero-extended on x86-64?


Sure. Here's an in-depth explanation from Fabian Giesen: https://gist.github.com/rygorous/e0f055bfb74e3d5f0af20690759...


Thanks, rygorous is always a great read - although sometimes a little overwhelming. If I got the gist of it, I have a small correction to your comment: the issue is about movsxd (sign extended integer indexes), not movzx (zero extension).


> It's easy to see why compiler developers added these optimizations: someone filed a bug saying "hey, why is my loop full of movsx", and the developers fixed the problem.

"fixed" by breaking other expectations. Regardless of what the spec says, that's still a stupid way to do things. There's a child comment below which examines this case in detail; and the real solution is to make the analysis better, not use UB as a catch-all excuse.


> compiler writers are basically taking the standard as gospel

I would be rather disappointed if they didn't, honestly.


Consider the following statements:

1) The standard says I must do this, so I must do it.

2) The standard doesn't say I must not do this (but does allow me to either do it or not do it), so it's totally OK if I do it.

I think you're thinking of cases covered by statement 1, and I think pretty much everyone agrees that compiler writers should behave that way for the standard to mean anything.

The issues arise in cases covered by statement 2. Just because the standard allows a behavior doesn't mean that the behavior is a good one. And yes, code relying on you not having the behavior is not following the standard, and that's something the authors of that code should consider addressing. But on the other hand, the standard may allow a lot of behaviors that only make sense in some situations but not others (totally true of the C standard, depending on the underlying hardware) and as a compiler writer you should think carefully about what behaviors you actually want to implement.

As a concrete example, you _could_ write a C compiler targeting x86-64 which has sizeof(uint64_t) == 1, sizeof(unsigned int) == 1, sizeof(unsigned long) == 2, and sizeof(unsigned long long) == 2 (so 64-bit char, 64-bit short, 64-bit int, 128-bit long, 128-bit long long). Would this be a good idea? Probably not, unless you are trying to use it as a way to test for bugs in code that you will want to run on an architecture where those sizes would actually make sense...


It's a collective action problem. If we want to give up runtime performance and get stronger guarantees about what code will be understood to mean, we should revise the standard and start using new optimizers that respect it. If every compiler goes its own way, I only benefit from what they already agreed on.


GCC and many other compilers have been known to change the consequences of undefined behavior unpredictably when upgrading, changing compiler flags, etc. For some examples that matters.


Knowing what the standard says and keeping to it as much as possible is important because every now and then, a major compiler finds some exciting new way to optimise code based on undefined behaviour, and breaks code that assumed GCC would always do some seemingly obvious reasonable thing it did when the author tested it.


If you use C as an assembler macro language, you aren't actually writing C. You're likely to get burned someday, unless you compile at -O0.


> as if the "C Standard" were in and of itself a terribly useful tool

Not necessarily, I took it to mean that engineering is holistic and things like compiler behavior in the face of undefined parts of the standard are important to account for.


[flagged]


Hey, please don't add personal attacks on top of your substantive points in HN threads. It helps nothing and makes the thread nastier and evokes worse from others. Also it's against the site guidelines: https://news.ycombinator.com/newsguidelines.html.


Where the author goes wrong is in assuming that somehow "I don't know" can be a final answer to these things. No, it is absolutely fucking vital that you know how the compiler will pad your structures in C. Similarly to the "what size is an int" on your architecture - on an ATmega8 this is 16 bit, but the chip can't actually do all 16 bit operations in single instructions.


I took that to be the point of the article though, that just looking at the code wasn't enough to know and you needed to go further to answer these cases for your exact use case or target platform.


Further: Unless your code is compiled, deployed to a rocket, and fired off the Earth never to return, the question of “what is my platform?” is meaningless in the context of writing good C.

So, today, using the compiler installed on your system right now, sizeof(int) = 32. Great. That means nothing, and changes nothing about whether your code is correct. You should not write code relying on it. Just like you should not measure the output of the questions on this test, and declare that you know what the answers are.


>Unless your code is compiled, deployed to a rocket, and fired off the Earth never to return, the question of “what is my platform?” is meaningless in the context of writing good C.

While I feel the tone of your comparison was intended to be a bit hyperbolic, the reality is that the bulk of modern C development occurs in a context similar to the one you describe. Further, the thought (utterly foreign to the vast majority of software developers) that the physical machine need not be some utterly abstract and constantly mutating target with no hope of being understood is, imo, one of the great dying arts of software engineering - a death perpetuated by the same sort of folks who think CS education should be carried on in Java.

I contend that, these days, most C is written to target a particular compiler, physical machine, and/or device.


There is vastly more old C code than new, and it didn't target the x64 or ARM architectures it's running on now. Where it wasn't portable, that was a defect that had to be fixed.

My first job was a 4GL targeting customers running DOS on the 80286, complete with runtime linking. 100% of that work has been abandoned due to incompatibility. It contributed nothing to the profession beyond what I personally learned.


There is a Mac program BBEdit that was first written to target 68K 32 bit Macs, then PPC 32 bit Macs, then 32 bit x86 Macs and then 64 bit Macs. Probably within the next 3 years it will target ARM Macs.

The author said he never did a full scale rewrite. He slowly migrated code from one platform to the next.

Today, Apple’s code runs on both ARM and x86 and with Marzipan, as will developers code. True most will be in Objective C, but some low level code is still in C.


I hope I'm being on topic and reasonable to point out that the result of the sizeof operator is in "number of chars", not bits.


This is why, decades ago, the C world moved on, and added types like int32_t and size_t, so programmers can say what they mean.


Ha, so I work in embedded too, and I was definitely one of the oddballs when I explicitly set out to land a job in systems out of college. Now I'm a bit older, but I still remember my hiring manager's glum face lighting up when he asked me what I was "into" at a career fair and I replied with "C and Linux, FPGAs, etc." Now I go to career fairs, and I can certainly empathize - there really aren't a lot of new college grads who have any interest in systems work. In fact, 9 months ago we hired a new college grad and they left after about 6 months to go do some data science thing for an insurance company (sounds boring to me).

Something I've heard discussed in CS education, and experienced myself to an extent (I'm a part-time adjunct faculty teaching Operating Systems), is how the more recent generations of students are, in-spite of the ubiquity of computing in their daily lives, purportedly entering programs "less computer literate" than previous generations. I don't believe "computer literacy" accurately captures the nature of the nascent deficiency - it's really about "_systems_ literacy".

I imagine what's happening to "systems" ownership is a lot like what happened to car ownership between the 1950s and the 1970s - people forgot how to fix them because they got more complex and needed less maintenance. I think for some of us older folks who experienced early home-computing, this isn't all that counter-intuitive of an analogy. In the old days, to play a computer game, it usually necessitated some amount of "tinkering around" under the hood of the "system" - possibly changing settings, maybe you needed to install more HW, maybe you had to manually fix some corner case overlooked by the errant programmer. I've heard that people who were young adults in the early-80s to late-90's were in a "sweet spot" for systems - people had easy access to them, but they also had to "repair" (modify, configure, augment, etc) them a lot.

Today, we're trying to fill-in the sweet spot with things like the RaspberryPi and the myriad of similar educationally-oriented embedded systems / home computers, but somehow I don't feel like most of those capture the "frustration" factor - they're well documented, and pretty regular.

So, while it may be hard to find people in their 50's who can create a multi-platform responsive app using whatever "cutting-edge stack" Medium is swooning over, it's been my experience that it's even harder to find someone in their 20's who can write a device driver, or even a halfway-decent C program.


I left that space (writing code to drive biotech instruments) because, frankly, engineers were viewed -- and treated -- as cost centers to be minimized. Software companies ime tend to view software engineers as sources of competitive advantage.


I work in the media industry, which is in full panic mode because of the competition from Facebook and Google. They have viewed software as an investment, we’re hiring more and more devs but we’re slashing the newsrooms and ad-sales departments every year.


I think young people don't go into systems/embedded largely because of the pay. The median pay in those fields is just lower than in web/distributed systems. I for one would love to hack on a compiler (for example), but not at a 50% paycut.


Same here. I focused on embedded systems in school and had a related internship before dropping out to start an embedded systems-related automation startup. When I later returned to the workforce, and every time I looked for jobs afterward, I tried finding any kind of embedded systems job but always ended up in web development, where the pay is better and the hiring easier.

I like the teams I work with, but if any systems company can match pay relative to CoL and is looking for experienced engineers who know how to understand systems from hardware to human, look me up.


As someone who's "getting up there in the years", I really want to believe such a shift will come to pass, but somehow doubt it will, at least for my generation. Partially because '80s - '00s generation CS students were by an overwhelming majority upper-middle class males, and it's hard to generate sympathy for what are now mostly high-earning older men. As pointed out by others, it's "the bias in tech nobody wants to talk about." There's almost this sort-of "serves you right for getting complacent" kinda attitude around all of it. In general, the most sympathetic coverage I see about ageism in tech is through investigative journalism contextualized with riches-to-rags stories of the loyal Company Man whose American Dreams are dashed at the behest of an uncaring Corporate Goliath: "He had a family, a stable 6-figure job, 30 years of experience, and multiple advanced degrees, but now he's in the breadline". The perfect example of this is ProPublica's coverage of IBM's layoffs. Meanwhile, there is little recognition of the parallels between the purported "millennial mindset", sought-after and lauded by corporate HR departments, which holds "my job is my purpose, my life's work" and the sort of toxic "leetcode-ism" / "brogrammer" culture that is harmful to diversity in tech.

Anecdotally, the older folks who work a strict 9-5 are, unsurprisingly, often the most welcoming and "chill" people I've worked with. They also manage to get the same amount or more "real work" (i.e. fewer bugs, less drama) done in that same 9-5 block (again, this should be unsurprising when one considers the effect of experience). However, I also recognize these older folks have been almost overwhelmingly men. In my career thus far, I can recall working with 3 women over the age of 40 who were direct-contributors. Most corporate HR departments believe at least part of the solution to this problem is laying off "old guys" and replacing them with "young woke millennials," whilst totally ignoring that the "old" part of that is itself a bias and protected category. However, the protections are getting weaker (https://www.propublica.org/article/appeals-court-rules-key-a...).

Anyways, for folks looking for somewhat offbeat advice - as I get around to being an "old guy" myself, I've found that having dreads and a beard (and generally being of that "long haired freaky people" bent), while maintaining physical fitness, has helped tremendously in masking my age. People are usually shocked to find I'm not "in my 20s or something." Of course this is sort of lifestyle-specific and not accessible to everyone, and I assume at some point I'll reach that "uncanny valley" with regards to my appearance and age, as mentioned by another poster.


Thanks! Finally some actionable advice in all of this, regarding the dreads and beard. Growing a beard now, too. On the other hand, I feel like misrepresentation doesn't quite get to the core of the problem ;-)


“Won’t someone think of the old white guys?” This very sentence came to mind recently. Sucks because we developers were the nerds, not the ones stealing your savings, housing, education, and healthcare.

There was a grey dread guy in the walking dead, Ezekiel?, is he cool with the young crowd?

