Hacker News
Ask HN: What is the most complex topic you understand?
44 points by ggeorgovassilis on July 13, 2022 | 92 comments
I asked the same question last year [1] and received fascinating answers - I'm looking forward to your new answers.

[1] https://news.ycombinator.com/item?id=26492180




I'm not sure there's anything that I understand, at least not fully. There are things I know a certain amount about, but probably nothing that I understand as fully as I'd like.

Also, understanding can be fleeting if you don't think about a certain topic all the time. Take relativity for example: I recall a very specific moment when I was talking to a friend who is actually formally educated in Physics, and we were talking about the speed of light, time dilation, mass expansion, etc. And I had a definite "aha" moment when I understood why mass has to increase with velocity and why time dilation had to occur. But... could I explain it to somebody else now (that was probably 10 years ago), and convey the same understanding? Do I even understand it as well now as I did in that moment? Arguably not. :-(

OTOH, if you just ask "What's a topic that you know a lot about?" I'd say, generally speaking, "firefighting" and "computer programming, especially in Java". shrug


I was taught special relativity and just "accepted" it because experiments confirm it. During a prolonged period of idleness I sat down and derived all the formulas from the underlying Lorentz transformation (basically the Pythagorean theorem). I think that helped me understand special relativity.
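For anyone who wants to retrace that derivation: the "Pythagorean theorem" step is the standard light-clock picture. A photon bouncing between mirrors in a clock moving at speed v traces a hypotenuse in the lab frame, which immediately gives time dilation:

```latex
\left(\frac{c\,\Delta t}{2}\right)^{2}
  = \left(\frac{v\,\Delta t}{2}\right)^{2}
  + \left(\frac{c\,\Delta t_{0}}{2}\right)^{2}
\quad\Longrightarrow\quad
\Delta t = \frac{\Delta t_{0}}{\sqrt{1 - v^{2}/c^{2}}} = \gamma\,\Delta t_{0}
```

Here Δt₀ is the proper time in the clock's rest frame; the remaining special-relativity formulas follow from the same factor γ.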


I guess it might be 'graph hashing', also known as canonicalization.

So this is related to some other things, like the Weisfeiler-Leman algorithm, Morgan numbers, and partition refinement. The idea is that one way to compare two graphs (networks) for equality is to 'hash' each graph. The resulting canonical representations of the graphs are then compared directly for equality.

I implemented a couple of different approaches for doing this - one based on a research paper calling the technique 'signatures', and the other based on an algorithm from a book called 'C.A.G.E.S' (can give references in a bit).

For me, the process of understanding how to achieve this meant learning about permutations, group theory, partition refinement, applying permutations to graphs, and so many other things. I think all told it took me more than a year to properly understand and implement it all!
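For the curious, here is a minimal Python sketch of one round-based idea behind such hashing: 1-dimensional Weisfeiler-Leman colour refinement. This is not the 'signatures' or C.A.G.E.S. algorithm mentioned above, just a simpler cousin that shows the flavour (unequal hashes prove non-isomorphism, but equal hashes do not prove isomorphism):

```python
from hashlib import sha256

def wl_hash(adj, rounds=3):
    """Crude graph fingerprint via Weisfeiler-Leman colour refinement.
    adj: dict mapping node -> iterable of neighbour nodes."""
    colour = {v: 1 for v in adj}                  # start with uniform colours
    for _ in range(rounds):
        new = {}
        for v in adj:
            # A node's new colour combines its own colour with the
            # sorted multiset of its neighbours' colours.
            sig = (colour[v], tuple(sorted(colour[u] for u in adj[v])))
            new[v] = hash(sig)
        colour = new
    # The sorted multiset of final colours is invariant under relabelling.
    canon = tuple(sorted(colour.values()))
    return sha256(repr(canon).encode()).hexdigest()

# Two labellings of the same triangle hash identically;
# a 3-node path hashes differently.
g1 = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
g2 = {'a': ['b', 'c'], 'b': ['a', 'c'], 'c': ['a', 'b']}
path = {0: [1], 1: [0, 2], 2: [1]}
```

A full canonicalization additionally has to search over permutations within each colour class, which is where the permutations, group theory, and partition refinement mentioned above come in.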

Very fun, but maybe not the best use of time, looking back...


In a distant past I started working on something similar, but reached the limits of my mind before doing anything useful. Tangential question: is there a difference in the context of graphs between canonicalization and normalization?


The simple answer is that I don't know! I never looked at 'normalization' in graphs.

From a quick google, it seems like it has been used in a couple of contexts. Firstly in GNNs, which I definitely know nothing about.

However I see another description in terms of RDF triples, which sounds like a canonicalization algorithm (https://json-ld.org/spec/ED/rdf-graph-normalization/20111016...)


Thank you


Germany.

Though I have to admit an ever increasing number of details are getting fuzzy as they fade into the rear-view mirror of my attention.

As a foreigner fluent in German, with a very diverse set of German friends, and years of experience living there, in three very different regions at different stages of my life, I will make the bold claim that I understand Germany, even if I probably understood it more in the past than I will in the future.

I’m sure there are many people who understand it better, deeper, more completely; but I’m not sure any of them are Germans. Ask the fish about water, etc.

I think this kind of sounds like hubris. I plead guilty, but with the mitigating circumstance that my claim to understanding isn’t any more hubristic than the next guy’s.

Austria, on the other hand, is a complete mystery to me.


I wouldn't go that far, but I understand where this comes from. To some level and in some ways, I can confidently say I understand where I live (Tallinn, Estonia) more than most of my friends.

I moved here almost 6 years ago and I have been enthusiastically trying to attend events, follow the news, travel inside the country, and discover new restaurants/bars/museums/activities as much as I can. As a result, many of my local friends have told me that they have never been to a city or venue that I mentioned.

OTOH, as most of my friends are between 20 and 40 years old, they have that many years of experience in this country, and there are many things they know that I don't: deep command of the language, local history and experience (especially of the Soviet and post-Soviet era), books/literature, nature, handicraft, songs, etc.


Would you mind explaining a bit what you mean with understand Germany?

I am German and my girlfriend is from Brazil. We talk a lot about cultural differences and how Germans are. So I am curious.


What are some major issues/concerns that you see for the present or near future of Germany?


Rather than "understand" I'll talk about complex "accomplishments".

I once built a computer. I had Alan Clements' "Microprocessor Systems Design" as an undergraduate text and built a 68000-based machine with 64M of RAM and ROM. Using discrete ICs on two big Eurorack wirewrap cards was hard work and needed good eyesight and dexterity. Then I wrote a bootloader and a Kermit-like RS232 program loader, and finally hacked together a really basic OS from Tanenbaum (with a synchronous scheduler and a fixed job table). A long summer of late nights. Can I say "I understand computers"? No way! But I feel confident I know more about them than most people I'll ever meet.

Another heroic adventure was "Linux from Scratch", compiling everything along the way with tinycc.

These things are rites of passage. They don't mean I understand those things, and even if I did, that knowledge is obsolete today, but I'm glad I did them.


Amazing. Do you have some docs/resources on this online? How would you go about building a computer out of commodity components today?

Do you think compiling everything on your own computer results in a noticeable increase in performance? Say a distribution provides binaries built on Intel, but I want to run on AMD; does it make sense to recompile the binaries on the AMD machine?


> docs / resources on this online?

Surprisingly, the old textbook went to three editions, the last in 1997 [1]. It looks as if it may be on archive.org [2], and it is definitely in Google Books and LibGen.

> How would you go about building a computer out of commodity components today?

Good question. You could of course retrace my steps using 1980s technology, but I wouldn't advise that. The problem today is physical scale. Most components are surface mount and you really need to build multi layer boards and use a bench microscope to place and solder. Getting anything hand-built to reliably run faster than 100MHz without knowing what you're doing is no fun. Everything I spent a thousand hours building is in a SoC for $1 now.

> Do you think compiling everything on own computer results in noticeable increase in performance?

That's a question to ask Gentoo/Arch users (and BSD people who use Ports). I think yes, because hopefully you'll optimise for the architecture you have. But is it worth it? Given the many hours you'll wait, maybe not at my age :) Do it while you're young and can learn cool things from it.

[1] https://www.goodreads.com/book/show/950048.Microprocessor_Sy...

[2] https://archive.org/details/microprocessor-systems-design-68...


There's also the excellent https://www.nand2tetris.org/

From basic logic gates all the way to running Tetris. It taught me so much about the lower levels.

Also a comment as a former Gentoo user of 7+ years - the performance is there - 5-15% maybe. I honestly don't think it's noticeable, it never was to me, but what you learn using Gentoo has served me very well.

I moved on as well. I'm too old to deal with fixing things and all the waiting, I just want it to work haha


Thanks. At first I was disappointed that it's only a virtual computer in a simulator, but then I found out there is a real physical realization, using an FPGA board:

https://gitlab.com/x653/nand2tetris-fpga/


Thanks.


Healthcare accumulators & benefit plan designs: embedded vs. non-embedded coverage, drug tiers, Medicare Part D coverage stages, deductibles, out-of-pocket maximums, formularies, preferred/non-preferred drugs, and drug-specific caps.

Essentially all those instruments healthcare companies invented to make coverage and payout rules complex.


The Prime minicomputer architecture, from the '80s. It competed with the DEC VAX, Data General Eclipse, etc.

The Prime architecture was based on the Multics design from a group of guys hailing from Honeywell and MIT. I worked on Primes for around 12 years, worked for Prime briefly in the 80's, and wrote an emulator that runs many versions of Prime's OS, Primos: https://github.com/prirun/p50em

Prime systems are very much the opposite of RISC, with instructions for:

- process exchange and task scheduling

- procedure call, incl arguments, ring crossing, stack handling, register saves

- decimal arithmetic, incl picture editing for COBOL & PL/I

- string instructions and picture editing

- 6 CPU modes for compatibility with older processors

- DMA, DMC, DMT, DMQ I/O modes for controller access to memory

- 32, 64, and 128-bit floating point


Bash*

I got a book published with a bunch of community and personal findings as an "I wish I had known this earlier" kind of collection.

* Giant disclaimer: Bash is insanely complex, mostly for historical reasons. There are infinite ways of doing basically anything, and all of them have different trade-offs. The big one usually being between the complexity of the code and the kind of situations it can handle.


Nothing. There's nothing that I fully understand because there's always another level deeper you can go.


I agree with the sentiment if the "another level deeper" is still the same topic. But, eg., I can be an expert in performant software programming without understanding hardware in depth. Understanding roughly how hardware (eg. word alignment) affects software performance is enough - no need to understand how signals propagate in a conductor.


You might be able to write performant software, but if you can't answer why it works the way it does (requires hardware knowledge), then I would say maybe you don't fully understand it. I think it's impossible to find someone who knows everything.


Not OP, but I think the question is not about the thing you fully understand, instead, amongst the things you know/understand, which one is the most complex compared to the others.


I understand software engineering and DevSecOps well enough to keep a number of applications in production use, but both subjects are so broad that I think anyone who tells you they fully understand them either doesn't know how much they don't know, or is selling something.


I don't know, because once I understand something it always seems very simple to me.


Then try a fun game: explain it to someone else, preferably outside of the problem domain. The premise you're capturing is great, because society benefits from your hard-won battle to distill that information into something that can be passed along to "future generations".

That's why I have the greatest respect for high quality documentation because a lot of energy went into crafting it


That's a good recommendation. On the other hand, there's an added difficulty for me to explain what I know because it's all in English and my native language is Greek. When I try to explain e.g. the subject of my PhD studies to my relatives or friends, I find that I simply do not have the Greek vocabulary to do it.

I started translating one of my papers into Greek just to address this limitation, but I guess I got bored and gave up midway.

I'm fine with explaining it all in English though. At the risk of inducing terminal boredom :)


I came here to say this. Once I get something, it becomes too obvious.


I understand a few complex topics, but I often fail to explain them on the first few attempts. So I realise that certain parts of the topics move in and out of my "working memory", but they are too large for me to grasp them entirely at any point in time.


Sigh. I don't believe this aids the discussion in any meaningful way at all.


To me this is the very crux of the conversation.


Cannabis strains and how CBD can work as a governor to suppress the psychological effects of high THC percentages. How the brain receptors react and trigger different states.

The science of recreational drugs is interesting to me.


That seems like unlocking the next level of hacking one's own consciousness. I understand why this is interesting.


English local government form, function and finances - or at least I used to, 15 years ago. For my sins, I worked as part of the Lyons Inquiry[1][2] team for 2.5yrs.

... And no. It's not all my fault.

[1] - https://www.gov.uk/government/organisations/lyons-inquiry-in...

[2] - https://en.wikipedia.org/wiki/Lyons_Inquiry


As others have said, I don't think I fully understand a whole lot! But let me think of it this way: of all the topics that I know stuff about, which one do I think has the biggest delta between what I know and what my co-workers in the same space know? I'd say cryptography. I'm not an expert by any means, but even among my co-workers, I find myself explaining a lot of basic concepts before even getting into the nitty-gritty.


To the extent that I understand anything, which is to say with all humbleness, incompletely, I'd say the most complex things I understand are: 1. human behavior 2. supply chains (arguably a subset of 1.) 3. distributed software systems


It’s a bit hand wavy but conceptually: Hot rock to running application is mentally wild.


I got good at understanding how rocket engines work and the design considerations simply by watching Everyday Astronaut and Scott Manley videos. But now that I understand it, it really doesn't seem complex.


Methods of self-organizing DHTs to optimize lookup performance. I had to learn hyperbolic geometry and figure out how to compute Delaunay/Voronoi graphs on it, at scale, in a distributed fashion.


What did Stross do to you?


Made his books available (in a public library) to an impressionable 16-year-old who then set out to become Manfred Macx. One PhD and a software engineering career later, I still blame him.


I've repaired multiple HP 5061 Cesium Beam Atomic Clocks. I've got a very good handle on how they work. I'm fairly good at explaining it to people as well.


My girlfriend. Of course it's not a complete understanding, and there's always new things to learn. But that's part of the wonder and enjoyment.


Humans are incredibly complex animals; you'll never be able to fully comprehend a person (not even yourself), so take comfort in the fact that this is what makes us unique and interesting. I personally love being surrounded by fellow imperfect, vulnerable creatures on a tiny planet at the edge of a galaxy.


Oh you are so wrong.

-- The girlfriend


Genome engineering and how our agricultural system actually functions. Next step is to have an in-depth understanding of the entire value chain of agriculture.


> how our agricultural system actually functions

I take "actually" to mean that your understanding differs from that of most people. Care to tell the highlights?


Most discussion I see around this topic lacks nuance which exists at every step of the value chain.

One example is the dichotomous nature of regenerative agriculture and GM crops. The average person generally takes a very argumentative position on this one, when in reality the two are not mutually exclusive.

Another example is how the industry operates in terms of corporate influence, academia, and allocation of funding. Which leads to the discussion of subsidies, staple crops, livestock feed, GHG emissions, and how to realistically transition to a more sustainable system. When folks talk about GHG mitigation, they often get it wrong. There is some great literature available on this topic in Nature Food Journal.

But in general, I wish people wouldn't reduce the topic to a wedge issue. Having a firm understanding of the entire system and its nuance is the only way to come up with viable solutions.

I'm happy to try and answer any specific questions that may come to mind.


Thank you


How to avoid making my wife unhappy (…most of the time)


CDMA, eHRPD, and LTE backend systems design and deployment. My other oddball skill is knowing old PBX/key-based phone systems.


The electrical properties and strength of covalent molecular bonds during DNA splicing

Also microinstruction vulnerabilities of CPUs.


Those are two very different areas. How did you come to learn about each one?


On my own. I'm also analyzing a copy of my very own DNA.

What’s harder is finding a sequencing company that will respect your privacy and actually do so. It helps to have a friend in biochemistry.

That’s the hobby. The real paying work is vulnerabilities.


Semiconductor fabrication process flows. How to start with a bare epitaxial silicon wafer, then process it step by step, until it leaves the fab.


The answer to life, the universe, and everything.


I don’t understand it. I know it, but I don’t understand it. To truly understand it I’d have to know the question and that eludes me no matter how deeply I think on it.


Software programming. Probably not surprising, considering the discussion board I'm posting on.


Nothing.

Some people don't understand things; that is not the same as understanding nothing.

I also understand Ashtanga and programming.


The most complicated thing you understand is nothing? As in nothingness? Is that a zen thing? I can tell you one thing I don’t understand. I don’t understand what you’re trying to convey.


Maybe I am just trolling a bit because I saw a bunch of these kinds of replies and I wanted to subvert the trope a bit.

But I am referencing daoism rather than zen.


FAA Certification


Developer experience and managing complexity


- Regular Expressions

- Unicode and characters encodings

- Native WebGL

- CSS3D

- Women

I lied about women.


EM theory. Ask away...


Jira and its schemes


Topic: how we understand things, etc.


Monads.


My wife.


FFT...?


Good for you


well... programming


Hmm, how do you know you understand programming?

I have programmed for 30 years, more than 10 of them in embedded Linux, mostly user space but also the kernel and boot loader. My code has been running in at least a million devices in the field. However, more often than not, when I need to look into systemd code I have the feeling I have no clue how this stuff can possibly work. Have I ever learned to program?


I would argue that this 'humble, always learning' mindset is exactly what enables experts to become experts but ironically causes experts to feel like they're not experts.


Yes you did, and of course you have a deep understanding. However, nobody knows everything about something, even if he studies that something for a lifetime.


data sync


Entropy.

It was hard for me to understand this arbitrary rule of things becoming less ordered over time. Was this just a fundamental natural law?

The answer is no. Entropy is a logical consequence of probability and time.

Why do things become more chaotic over time? Because chaotic configurations have a higher probability of occurring.

There are far more disordered configurations of things than there are ordered ones; this is why things become more disordered with time. Time changes the configuration, and by probability a high-probability configuration is more likely to occur than a low-probability one. So the axiom of nature here is not entropy, it's probability. Entropy is just a consequence of probability.
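A toy model makes the counting concrete: put N particles independently into the left or right half of a box, with all 2^N microstates equally likely. (The numbers below come from this toy model, not from any specific physical system.)

```python
from math import comb

N = 100  # particles, each independently in the left or right half of a box

total = 2 ** N                      # number of equally likely microstates
all_left = 1                        # the single "perfectly ordered" state
# states with a roughly even split: 45..55 particles in the left half
near_even = sum(comb(N, k) for k in range(45, 56))

print(all_left / total)             # ~8e-31: effectively never observed
print(near_even / total)            # ~0.73: the overwhelmingly typical case
```

Even at only 100 particles, the ordered state is astronomically rare while the "disordered" near-even states dominate; at Avogadro-scale N the ratio is beyond comprehension.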

There are systems where ordered configurations are more numerous than disordered configurations, and in those systems things become more ordered with time. In these cases entropy is STILL defined to be increasing as things get more ordered. Thus entropy is not really describing disorder, or is it?

The thing I don't fully understand yet is heat entropy. Apparently when you take heat into account for everything, the disordered intuition suddenly becomes applicable. So if you have a system becoming more ordered with time, heat must be increasing somewhere to offset this increase in order. Maybe someone can explain this part to me?


> The thing I don't fully understand yet is heat entropy. Apparently when you take heat into account for everything, the disordered intuition suddenly becomes applicable. So if you have a system becoming more ordered with time, heat must be increasing somewhere to offset this increase in order. Maybe someone can explain this part to me?

You can extract work from order, and can create order with work. Consider this scenario:

We have two kinds of particles and two rooms. If we have one kind of particle per room it is pretty ordered; if we mix everything then it isn't ordered. Now, let's say we have two filters: filter 1 lets particle 1 through and filter 2 lets particle 2 through, but not the others. So we have a wall between the rooms made of these two filters. Particle 1 will just put pressure on filter 1 and vice versa. This way we can let particle 1 push its filter through room 2, which creates work, thus mixing the particles a bit. Do the same with particle 2 and now we have mixed both and extracted the work from mixing the particles. We can reverse this process by moving the filters in the opposite direction, dividing the particles again, and this reverse process requires work to perform.

Exactly how the filters work doesn't matter: as long as each lets the other particle through, the particles will stabilize and eventually even out the pressure on both sides. It could be slow, but it would work. It doesn't have to be perfect either; if the pass probability isn't exactly the same for both particles you can repeat the process until you get the purity on each side you want. We already use this to enrich uranium, for example, so in theory we can make filters for basically anything.
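For reference, the work bookkeeping in this thought experiment is the textbook entropy of mixing: for N particles with fractions x₁ and x₂ of the two kinds, isothermal mixing through such semi-permeable filters can yield at most

```latex
\Delta S_{\text{mix}} = -N k_B \left( x_1 \ln x_1 + x_2 \ln x_2 \right),
\qquad
W_{\max} = T \, \Delta S_{\text{mix}}
```

and un-mixing them again costs at least that much work, which ends up as heat elsewhere. That is the heat increase that offsets the gain in order.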


Ok, but notice this argument is based on purely probabilistic assumptions, namely that all micro-states are equally probable and that all of them are actually visited over time. There is still the question whether these assumptions are valid in a given physical system.

In physics we know much more about evolution of systems, for example in classical mechanics model of gas, there are deterministic equations of motion which also conserve some quantities like energy. So not all thinkable chaotic states are accessible, and the evolution of state is not a repeated random choice out of the whole set of states.


I envy you for your understanding of entropy. I understand a few simple aspects of it, but the deeper meaning eludes me. It does seem to be a useful concept which grants access to so many knowledge domains.


I understand one aspect of it in terms of probability. However another aspect of it eludes me as well, as I described in my last paragraph.

There's also a third angle from information theory. This type of understanding I haven't really studied in depth yet.


> Because chaotic configurations have a higher probability of occurring.

That seems like a non-explanation though. Why do chaotic configurations have a higher probability of occurring?


There are way, way, way more of them, so if you randomly select from the set of all possible configurations, you are much more likely to get something spread out and chaotic than something with a recognizable structure. As atoms and molecules bounce around, they are effectively randomizing: after a few dozen interactions each, they're basically in a new state. Repeat over and over and you're basically just drawing from that same set again. There's a chance that all of the air molecules could bounce to one side of the room at once, but the likelihood of that makes it something you won't see in the lifetime of the universe. So if you see a video where all the air rushes from half the room to fill the rest, you can be effectively certain you're watching the video forward and not in reverse.


From other replies to your question I understand that you use "chaotic" in the mathematical sense while other commenters use it in the colloquial ("random") sense.


I use it in the colloquial sense. Disordered, random.


Because, as he said, there are more of them.


12345 is less plentiful than 32415 or 32154 or .... What we consider "ordered" has fewer possibilities than "unordered", not just in a string of 5 numbers but for most systems. That should help you get the intuition. Lmk if you need more elaboration.
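A quick way to see how fast "ordered" arrangements become rare as the system grows: of the n! equally likely orderings of n distinct digits, exactly one is fully sorted.

```python
from math import factorial

# One sorted arrangement out of n! equally likely ones.
for n in (5, 10, 20):
    print(n, 1 / factorial(n))   # ~0.83% at n=5, shrinking to ~4e-19 at n=20
```

The same collapse, scaled up to ~10^23 particles, is why macroscopic order essentially never appears by chance.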


Master that entropy and you too can enjoy sending data over the Internet without changing one bit of the data: pulse-width modulation of inter-packet delay gaps.
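For readers wondering what that means: it is a covert timing channel. The packets themselves never change; only the gaps between them carry the bits. A toy sketch of the encoding (the constants and helper names here are made up for illustration, not any real protocol):

```python
SHORT, LONG = 0.05, 0.20            # illustrative gap lengths in seconds
THRESHOLD = (SHORT + LONG) / 2

def gaps_from_bits(bits):
    """Sender side: each bit becomes the delay before the next
    otherwise-identical packet."""
    return [LONG if b else SHORT for b in bits]

def bits_from_gaps(gaps):
    """Receiver side: recover the bits from observed inter-arrival times."""
    return [1 if g > THRESHOLD else 0 for g in gaps]

message = [1, 0, 1, 1, 0]
assert bits_from_gaps(gaps_from_bits(message)) == message
```

In practice network jitter forces wider gap spacing and some error correction, which is what makes the channel slow and "entropy-flavoured".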


women


Please use this thread for truthful answers only.


Damn, didn't realize Chabuddy G is on hn


I see how this is funny xor insightful.



