x86 Bare Metal Examples (github.com/cirosantilli)
513 points by Cieplak on March 19, 2019 | 55 comments



Looking at the skills section on his webpage: http://www.cirosantilli.com/skills/

This seems like the most honest self-assessment of skills I have ever seen.


I literally wrote a book on Meteor.js and still have to check the manual for stuff all the time. The fact that engineering interviews are essentially roast sessions / pop quizzes is insane.

I interviewed with Snapchat a few years ago and was asked to write a Sudoku solver. Some context: I had never played Sudoku in my life, lol. I came up with a naive (unoptimized, unrefined) solution and didn't get past their third round.
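
(By "naive" I mean plain brute-force backtracking, something along these lines; a minimal sketch in C, not my actual interview code:)

    #include <stdbool.h>
    #include <stdio.h>

    /* Naive backtracking: try digits 1-9 in each empty cell (0), recurse,
     * undo on failure. No heuristics, no constraint propagation. */
    static bool valid(int b[9][9], int r, int c, int d) {
        for (int i = 0; i < 9; ++i)
            if (b[r][i] == d || b[i][c] == d ||
                b[r / 3 * 3 + i / 3][c / 3 * 3 + i % 3] == d)
                return false;
        return true;
    }

    static bool solve(int b[9][9]) {
        for (int r = 0; r < 9; ++r)
            for (int c = 0; c < 9; ++c)
                if (b[r][c] == 0) {
                    for (int d = 1; d <= 9; ++d)
                        if (valid(b, r, c, d)) {
                            b[r][c] = d;
                            if (solve(b))
                                return true;
                            b[r][c] = 0; /* undo, try next digit */
                        }
                    return false; /* no digit fits: backtrack */
                }
        return true; /* no empty cell left: solved */
    }

    int main(void) {
        int b[9][9] = {{0}}; /* empty grid: any valid completion will do */
        if (solve(b))
            for (int r = 0; r < 9; ++r, putchar('\n'))
                for (int c = 0; c < 9; ++c)
                    printf("%d ", b[r][c]);
        return 0;
    }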


That was copied from a failed Google interview BTW :-)

To me the main problem is that I'm always afraid to put a 5: who never needs to check the manual? ;-)

I'm also more and more of the opinion that the impact of one's projects is more important than one's current skills.


I find this interesting because grade inflation is real. There are so many straight-A students that it loses its value as a measure. But if you can do something without looking it up, I would agree that counts as proficient.

A good team member is a larger force multiplier than a single rockstar.


I have found inverting this kind of question helps.

Rate your interest in the following:

Item A
Item B

People tend to express high interest where they want to grow, or where they have a need.


> To me the main problem is that I'm always afraid to put a 5: who never needs to check the manual? ;-)

Quite often I find that the greater skill lies in knowing when and how to check manuals, rather than expecting to be a knowledge machine.


> I'm always afraid to put a 5

I realized one day why I behave in ways like this: working in a clear-cut environment has led me to treat normal communication literally. Someone once commented that I 'never lie', which I understood was much more true than false, but ultimately still false.


Wow, that is how I've wanted to answer the "on a scale of 1-10, how would you rate your XYZ skills" question.

I really wish more people had some benchmark for the numbers they throw out there.


Similarly, if I see more than one or two 9s or 10s on self-assessed skills, I take that as an invitation to "sock it to me" on knowledge questions when interviewing someone. When people rate themselves 4-6 I'm a lot more reasonable.


> When I tell to managers that I’m good at documenting, they always say: great, we need better documentation! But then, one of the following may happen:

I relate so much it hurts. The industry at large seems unable to see the opportunity it loses by not investing significantly in technical documentation.


I'm kind of curious who's behind this Ciro Santilli guy.

He(?) seems to be producing a ton of interesting GitHub repos, a great deal of informative Stack Overflow answers, and who knows what else.

Also, some anti-CCP stuff in the profile?


If you have any specific questions, fire away :-)

Homepage + social media links from there should give a good idea of who I am: http://www.cirosantilli.com


Dang, while I appreciate your one-compiled-and-one-interpreted-language way of life (pragmatic!), this makes me sad:

> Swift: I’m not an Apple person.

I'll admit, if I wasn't a professional iOS code slinger I probably never would have approached it, but at this point it's by far my favorite language to write (especially prototype) in.


I'm always interested when it comes up, especially when I see news about its usability on non-Apple systems. But the last update I remember seeing (multiple months ago, admittedly) was that it was coming along, but there wasn't quite parity with the Apple ecosystem, and some (core?) libraries were different / not as good?

The impression I got (from comments here about trying to use it) was sort of an early Mono .NET type of situation (maybe not that forked, but still). It's hard to put any effort behind learning a language when it feels like you would be a second-class user. I spent way too much time and effort being the guy trying out experimental Linux support for projects in the late '90s / early 2000s, and I have less time now, so my explorations need to be a bit more directed.

It came down to Swift or Rust when I was picking a new language to learn a while back, and I picked Rust for the reasons above. I've yet to do anything with it (and I'll have to brush up on it yet again when I do), but at least I don't feel like I'm getting second-class support from the language.

I wouldn't mind being mistaken, or finding that this has already been addressed and Swift's ports to other systems are at parity with Apple's. That would be nice. I wouldn't mind spending some extra time on it then; it does look to be an interesting language.


I'm on the verge of trying it. Would you like to share your opinion on why you consider it good?


I am a big fan of protocol-oriented development. It allows you to do component-style programming, where it's very simple (swifty) to extend functionality onto structs/classes in an easy-to-understand, reusable way.

Additionally, language features like guard/optionals/etc... allow you to deal with error states & control flow easily.

Two good videos I'd recommend on protocol-oriented development:

Protocol-Oriented Programming in Swift (part 1): https://developer.apple.com/videos/play/wwdc2015/408/

Protocol and Value Oriented Programming in UIKit Apps (part 2): https://developer.apple.com/videos/play/wwdc2016/419


I wrote C++ in college before I started with iOS/Obj-C around 2010, and 75% of my work from then until Swift launched was Obj-C; the rest was Java. So, just from a readability and writing standpoint, the verbosity (or lack thereof) was a huge win, however that's a bit subjective and more of an aesthetic reason.

1. I really like the let/var system, combined with the let/guard conditional assignments. It might not work for everyone, but I write cleaner code because of it. I also love optional chaining, and I've really molded my thinking around it, as it often encapsulates large chunks of logic in a single line (essentially, do we go past here or not, based on the state of the data and whether things are set/valid). It's null checking that doesn't feel bolted on.

2. Swift has made some big changes from release to release, sometimes breaking existing code, but my largest codebase is ~70k lines, and it's taken me at most a few hours to get rolling again (FWIW, the auto-updater did not work for me on 2.2->3, I believe it was). That said, the changes are worthwhile. JSON (de)serialization built in via the Codable protocol was a big upgrade for me, removing a vast amount of boilerplate, as well as my reliance on a 3rd party library (although big thanks to NerdRanch for FreddyJSON, it served me well).

3. Speaking of 3rd party libraries, CocoaPods has treated me well. Easy to use, not too difficult to create your own custom libraries and manage them from your own git repos.

4. I know I don't use them to their full potential, but the higher-order functions:

https://medium.com/@abhimuralidharan/higher-order-functions-...

are a real game changer. Those operations, combined with my own drive over the last ~5 years or so to write more tightly coupled, functional code, have resulted in far more maintainable, easy-for-humans-to-parse systems.

Granted, it's not all daisies and roses. I hate how it handles strings, and they can't seem to settle on an internal representation/manipulation mechanism. The safety of the whole ecosystem makes working with raw byte representations/pointers a bit of a hassle when you need to do it, but it isn't terrible/impossible.

I'm by no means an expert, and just by the nature of my work and my responsibilities (especially in other domains) I don't feel that I've had the chance to truly dig into the language for all it's worth. For instance, when I watched this video:

https://www.skilled.io/u/swiftsummit/server-side-swift-from-...

My mind was blown, and I didn't realize just how much I was under-leveraging the type system, and I hope to have some time to do a few personal projects to really integrate some of the more core pieces of the language into my workflow soon.

This is already huge and ranty, so if you have any pointed questions I'd be happy to take a stab.


Where can I find the list of words that will trigger the firewall? Does this work with Chinese characters only, or can Western words also do the job?


Have a look at: https://github.com/cirosantilli/china-dictatorship/tree/00a2... The pinyin for most blocked characters, or the Western names of events, will likely also work.


Lots of good looking material in here. Thanks for sharing your knowledge.

Semi-unrelated: what's up with the script tags on the profile page?


I did a bit of webdev in the past, so just casually looking for XSS on GitHub / websites that integrate with it.

I'm not a security person though, more interested in performance stuff.


I'm curious. Why did you learn Chinese, but not Rust or Haskell yet?

And yes, I agree with https://github.com/cirosantilli/china-dictatorship and that speaking Chinese is much simpler than reading or writing it. One of the simplest languages, actually.


Alas, I'm married to a Chinese woman, and not Graydon Hoare :-)


Interesting. I have to admit I hold a stereotype and prejudice: that any Westerner married to a Chinese woman is pro-China. I saw you use some sensitive Chinese words in your username in a different community, so I thought it was a Chinese guy in a Western country behind it.


When your wife does Falun Gong, and her father spent a decade-plus in semi-prison during the Cultural Revolution, that tends to not bring out the brightest side of the Chinese Communist Party ;-)


I feel like this material is good for reference, but not so much for learning (especially if starting from zero with your first OS). I applaud the effort though.


Yes, this can be argued. Some of the links in the bibliography might be more suitable for that: https://github.com/cirosantilli/x86-bare-metal-examples/tree...


Ah, this guy! I found him through his stack overflow maybe a year ago, definitely cool.


[flagged]


Could you please stop posting unsubstantive comments to Hacker News?


I'll drop a question since I see Ciro Santilli is around. I want to start learning low-level programming, yet I feel awfully overwhelmed every single time I try. Any recommendations you might have?


If you can still choose, study something that is related to laboratory work rather than programming, because anyone can buy a laptop and learn to program, but almost no one can access a lab.

If you can only program, first choose an application that you are passionate about, that is new, hard and important, and then learn whatever you need to achieve that goal. Application first, method later.

After that, if you still want to learn low level programming... :-) use emulators + minimal examples like in my tutorials, read / step debug the source of major projects and document that somewhere, and then try to modify those major projects with useful extensions.


Start with the book "Computer Systems: A Programmer's Perspective" by Bryant and O'Hallaron. Become fluent in C and its toolchains and you will be able to program everything (learn assembly only as needed).


Do you have any hello-world examples of how to load a 64-bit ELF kernel with a GRUB2 bootloader? I don't mean the Linux kernel; I literally mean a simple program that only prints "hello world". Something like this [1], but in long mode.

[1] https://wiki.osdev.org/Bare_Bones
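
From what I understand, GRUB2 can load an ELF64 image but still hands over control in 32-bit protected mode, so the kernel itself has to set up paging and make the jump to long mode; a hello world would mostly be that transition code. The Multiboot 2 header part would look something like this (an untested C sketch; the .multiboot2 section name and its placement via a linker script are assumptions on my part):

    #include <stdint.h>

    /* GRUB2 scans the first 32 KiB of the image for this 8-byte-aligned
     * header: magic, architecture (0 = i386 protected mode), total header
     * length, a checksum that makes the four fields sum to zero, then the
     * required terminating end tag. */
    struct mb2_header {
        uint32_t magic;
        uint32_t architecture;
        uint32_t header_length;
        uint32_t checksum;
        uint16_t end_type;  /* 0 */
        uint16_t end_flags; /* 0 */
        uint32_t end_size;  /* 8 */
    };

    __attribute__((section(".multiboot2"), aligned(8)))
    static const struct mb2_header mb2 = {
        .magic         = 0xE85250D6,
        .architecture  = 0,
        .header_length = sizeof(struct mb2_header),
        .checksum      = (uint32_t)-(0xE85250D6u + 0 + sizeof(struct mb2_header)),
        .end_size      = 8,
    };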


Using BIOS calls doesn't really seem "bare metal". One could use a BIOS interrupt to read data from a disk, or one could write a device driver that talks to the target device through memory-mapped I/O to request the file. I'd call only the second case a "bare metal" approach.

Fortunately most of the BIOS examples use bios_* filenames. The rest of the files are very nice.
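
To illustrate the distinction, a hedged sketch: the BIOS route asks the firmware to print for you, while the "bare metal" route writes to the VGA text buffer yourself (assuming 32-bit protected mode with the default text mode buffer still mapped at 0xB8000, as in the repo's protected mode examples):

    /* BIOS route (real mode only): int 0x10, AH=0x0E teletype output:
     *     mov $0x0e, %ah
     *     mov $'A', %al
     *     int $0x10
     * Bare metal route: drive the device memory directly. */
    #include <stdint.h>

    static volatile uint16_t *const VGA = (uint16_t *)0xB8000;

    static void vga_puts(const char *s) {
        for (int i = 0; s[i]; ++i)
            /* low byte: ASCII code; high byte: attribute (0x07 = grey on black) */
            VGA[i] = (uint16_t)((uint8_t)s[i] | (0x07 << 8));
    }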


Yes, here we go again: https://news.ycombinator.com/item?id=18532102 :-)

I wonder how much lower level you can go with QEMU / how much it can match real hardware.

Pull requests / links welcome ;-)


QEMU runs the 'seabios' bios image in the guest, so you could certainly run a 'bare metal' custom image instead of the default BIOS blob if you wanted. At the basic level of "prod the UART, prod the timers" we should be a reasonable match to real hardware. Running on real hardware without the BIOS would be trickier as you start to need to do things like set up memory controllers, which you can get away without on QEMU.
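
For a concrete sense of "prod the UART": a 16550 polling write like the sketch below behaves the same under QEMU's default PC machine (with -serial stdio) as on a real PC's COM1, except that on real hardware you would normally program the baud rate divisor and line control first:

    #include <stdint.h>

    /* x86 port I/O via GCC/Clang inline asm */
    static inline void outb(uint16_t port, uint8_t val) {
        __asm__ volatile("outb %0, %1" : : "a"(val), "Nd"(port));
    }
    static inline uint8_t inb(uint16_t port) {
        uint8_t v;
        __asm__ volatile("inb %1, %0" : "=a"(v) : "Nd"(port));
        return v;
    }

    enum { COM1 = 0x3F8 }; /* standard I/O base of the first 16550 UART */

    static void uart_putc(char c) {
        /* poll the Line Status Register (base + 5) until the transmit
         * holding register is empty (bit 5), then write the byte */
        while (!(inb(COM1 + 5) & 0x20))
            ;
        outb(COM1, (uint8_t)c);
    }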


Okay I've sent you a pull request with a bare metal hello world example ;)


One can follow this "toward the bare metal" path as far as one wishes. E.g. in Silego GreenPAK chips, a CPU is considered an unneeded abstraction and you program the raw state machine.


What is the use case in real life?

Is it to run special lab/factory equipment? Anyone here who develops for bare metal: can you share any details?


This repo is mostly just educational, a helper for understanding the Linux kernel's x86 side / Linux kernel drivers.

On e.g. ARM, bare metal is potentially more useful due to embedded work. But even on ARM you should just use the Linux kernel / some RTOS if you can get away with it :-)


The use case is to break through some layers of abstraction. If no one wrote such guides, who would create the next OS? We would all be damned to copy existing code and pray it works.

I applaud the author for the effort; in particular, using images that can be booted should make it feasible to use these techniques for teaching!


A dream I've had for a long time is to make an application that you boot into, the only application on the computer, for maximum optimization. Upgrades would be easy: just reboot the computer and load the new software. I wouldn't write everything in assembly, though.


What you describe used to be very common: https://en.wikipedia.org/wiki/PC_booter


The problem comes when you need to use any kind of hardware: all the drivers would have to be part of your app. The "unikernel" concept, however, does some version of this: it loads a kernel which supports only the minimal functionality and drivers needed, with a single address space for a single app.


Historically this was more tractable because driver code was much simpler and there was a lot of cloning/emulation of "legacy" hardware interfaces (a lot of this is still present on PC, despite vendors' desire to get rid of it). Typically either the hardware capabilities were fundamentally narrower in scope than modern hardware or the hardware had its own controller (sometimes with more raw power than the host CPU...) that implemented an abstract interface. A bare-metal VGA driver for a single mode is no more than a few dozen lines of code, whereas a modern GPU driver stack literally has an optimizing compiler built into it.
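
For a sense of scale: once the card is in a single fixed mode, say mode 13h (320x200, one byte per pixel, linearly mapped at 0xA0000), the whole "driver" can be as small as this sketch (assuming the mode was already set, e.g. by the BIOS before leaving real mode):

    #include <stdint.h>

    /* VGA mode 13h framebuffer: 320x200, 256 colors, one byte per pixel */
    static volatile uint8_t *const FB = (uint8_t *)0xA0000;

    static void putpixel(int x, int y, uint8_t color) {
        FB[y * 320 + x] = color;
    }

    static void fill_rect(int x, int y, int w, int h, uint8_t color) {
        for (int j = 0; j < h; ++j)
            for (int i = 0; i < w; ++i)
                putpixel(x + i, y + j, color);
    }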


This is standard "embedded" systems programming for lower-end MCUs. "Unikernels" use the same concept to create standalone bootable binaries (linked with the required OS modules) that run directly either on hardware or on a VM hypervisor.

See "IncludeOS" for one representative example.


Yes, the ASM overkill was a mistake of my youth :-) https://github.com/cirosantilli/x86-bare-metal-examples/tree...


Besides what everyone is commenting on, this is how Amiga, Atari ST and most console games used to be done.


So a Unikernel with your program?


You're describing any number of embedded systems, including the Arduino runtime: load up an Arduino app and you have your dream come true :-)



You can use EFI text mode or the framebuffer these days.


I wanted to do a proper UEFI example, but got lazy, sketch at: https://github.com/cirosantilli/x86-bare-metal-examples/tree...

Links to minimal runnable working examples / patches welcome.


"\n\r" should be "\r\n" (0D, 0A -> CR, LF).


Do you mean at: https://github.com/cirosantilli/x86-bare-metal-examples/blob... ? Does it make any difference in BIOS, where \n seems to mean "move cursor down" and \r seems to mean "go to start of line"?



