I highly recommend "21st Century C" (http://shop.oreilly.com/product/0636920025108.do) as a tour of modern C programming, including use of C99 features. It goes over C tooling (profiling, debugging, testing, cross-platform deployment), and explores C99.
Although GP didn't remark on it, TFA uses another cool feature which is actually C11: anonymous unions. Although not as wonderful as designated initializers, they're really quite convenient.
The GNU modes just enable additional features which conflict with plain C. Any feature which can coexist with plain C will be enabled in the regular C modes too. For example, __typeof__ is enabled everywhere, because the compiler is allowed to do just about anything it wants with a __ prefix, but plain typeof is only enabled in the GNU modes, since it could cause valid C programs using "typeof" for their own purposes to fail to compile.
When I was in university, I did something similar in Python. I printed it off in color and brought it to a career fair. One guy loved it. But I never heard back from him :(. And that's the story of how I came to work in Java ;)
This is why when you meet someone at a networking event:
0) Have brought a pen or cell phone.
1) Always get their business card or contact info.
2) When you get someone's business card, always write down where you met them and what you want to ask or show them. Otherwise, you end up with a pile of cards and no memory of who they belonged to.
2.1) If you actually had them type their email into your phone, sketch out the email, but don't send it.
You pass arguments into your program through argc and argv: argc is the number of arguments and argv is the array of arguments.
There is no string type in C, so you use char pointers instead. But in the previous code he used char*, which usually refers to a single string; main actually receives an array of strings, so it should have been char** (pointer to pointer to char, or pointer to 'string').
Nope, neither will segfault on the initial access.
char ** argv
is a NULL-terminated array according to the C standard. (EDIT: see clarification below.) C doesn't care about types at all, and you can abuse this by casting pointers to different (sometimes incompatible) types. The second argument to main (argv) is set up by the initialization code and takes the same space as a pointer regardless of whether it's a string, array of strings, struct, integer, etc.
The existing structure will simply be interpreted as that new type like so:
We are talking about C, not a specific compiler, so there are a lot of mistakes in your comment.
char** is a pointer to a pointer to char and nothing else; it might be a NULL-terminated array, or not at all.
C cares about types. Casting a pointer to an incompatible type, or reinterpreting through an incompatible pointer, is undefined behavior in C.
Pointers of different types are allowed to have different sizes: sizeof(char*) == sizeof(char**) is not guaranteed in C.
Also, any program whose main is not int main(void) or int main(int, char**) will result in undefined behavior.
All of this is in the latest standard.
Either segfaulting or running completely normally can be a result of undefined behavior. That is why we really like to avoid it in C. So your advice is really not good for modern C.
In the case of main's argv, though, argv[argc] is defined to be NULL according to the standard. Assuming the pointers are the same size, what I said will work, but it is bad practice.
> sizeof( char* ) == sizeof( char* * )
True, I forgot that different architectures/compilers can produce different pointer sizes. Thanks!
I think we have bigger problems to address before we get to these minor contrivances.
This resume is not web scale enough. We need to wrap the resume in node.js so it can be non-blocking for big data. It should be written in an esoteric JS variant and transpiled to ES5 because everyone knows ES5 is unreadable. Oh and the storage of his education and job history should really be stored in a NoSQL database like MongoDB that way he can shard asynchronously. When he presents the resume in his interviews it should be wrapped in a webkit container.
* Avoid using NULL in your resume. It has a negative connotation, and makes your resume look bad.
* Make use of as many existing libraries as possible, to show that you're not the kind of programmer that wants to invent the wheel on every occasion.
* Try to obfuscate your resume a little (but not too much!) As it is written here, your reader can easily guess what the program will do, and he/she will not even want to run it anymore.
* #include a portfolio of your work inside your resume.
He's already iterating by using null termination instead of based on array bounds. I am suggesting he do it in the more common way.
On another topic, I have noticed you post this sort of trollish C bashing on almost every HN thread about C. It's fine that you prefer something else, but maybe it's time to give it a rest?
You're not alone: a lot of people here are not shy about expressing that they feel more productive working in languages with GC and bounds checking. That's fine.
I happen to think that a good understanding of C has made me a much better developer. I don't think your opinions, however strong, genuine, and coming from a real place, invalidate that.
What I don't appreciate are the attitudes that C is uninteresting, that nobody is productive in C, etc. Or that it is phrased as a moral duty to avoid C. I feel like it's too common to overstate the dangers. Safe C is still possible.
I think Peter van der Linden would agree. He states in Expert C Programming (Deep C Secrets) that the separation of the compiler from the linter, and I assume by extension semantic analysis tools as well, was a mistake in retrospect. It's good to see that compilers like Clang are bringing them back together in a way.
One does have to be more careful when writing C, but I don't "blame" C for unsafe code. It's the kind of language that assumes the programmer knows what they're doing. In this regard it is very unforgiving.
I don't understand your false premises. I've never had C marketed to me as a language that promised safe, perfect code. It does what it says on the tin.
However, I think you have some false premises: the Linux kernel is huge, has a fairly high turnover in contributors, and is written in C. Hardly anyone "knows" the entire code base. I've yet to meet anyone who can even name all of the compile switches... some 1000+ of them. Yet it's hard to argue that it isn't useful, or that it's impossible to contribute to. I'm not a genius, and if I can figure it out I'm sure I can teach others.
Yes, you can write bad code in C. I don't think you can blame C for human error. There are sins of commission and those of omission and the ANSI C specification documents them both quite well. Designing a language that protected an ignorant and uneducated programmer from making mistakes wasn't one of their goals I'm afraid.
While the separation of lint from the compiler might have been a mistake made as an early performance trade-off I don't really see how that was a false premise of the philosophy of trusting the programmer. Given the history of trade-offs made in the name of performance it doesn't seem like correctness and safety were big concerns.
Maybe they are now and that's why I find the Mirage project interesting... but C still has its uses and I don't blame C for human error. The specification isn't terribly difficult to digest and the tooling is rather good these days.
It's subjective. It feels more C-like and natural to me to write while (foo->bar) and not while (foo[i].bar). The latter feels like the way you would do it somewhere else, grafted onto C. But it's not wrong, I don't have deep qualms with it, and that's why I put it last on my list.
It's not subjective. Intel's guidelines for writing vectorizable code:
> Prefer array notation to the use of pointers. C programs in particular impose very few restrictions on the use of pointers; aliased pointers may lead to unexpected dependencies. Without help, the compiler often cannot tell whether it is safe to vectorize code containing pointers.
What are the odds this turns out to be practical advice for this example? Are you going to write all your loops like that because someone at Intel told you it was a good idea, or are you going to measure it when it's seen to be a problem?
The poster wasn't summing values; they used the loop to call into printf. I'm guessing they didn't use those particular compiler options either... What you have here sounds like a lot of tuning that doesn't apply equally to all scenarios.
I would have to agree. IMO it lends itself to cleaner code if you're working with NULL terminated lists, especially if you use a `for` when doing the iteration instead of a `while`:
const char **a;
for (a = job->accomplishments; *a; a++)
printf(" - %s\n", *a);
And
job_t **j;
for (j = jobs; *j; j++)
print_job(*j);
The person who wrote this code seems to dislike `for` loops for some reason; these are pretty obvious places to use one IMO, with an index `i` or not. Doing it with a `while` separates out the initialization, condition, and increment in his code, and `continue` won't work like you normally want it to.
This is hit or miss. Say they are looking for a C coder. A bad recruiter (most of them) would just discard this, as he/she wouldn't even recognize it as C code, just some junk that doesn't fit the template.
A good recruiter though, or if a recruiter wasn't used - they would see this as smart and creative and definitely get you an interview.
I've seen companies shut down or relocate because of bad recruiters filtering out good programmers like this.
Personally, I wouldn't want to work for a company that uses bad recruiters (and thus probably hires bad programmers), so something like this would be doing its job.
You don't have to go this far. My CV is an HTML page, and I just refuse to convert it to Word. It hinders the recruiters who just like to buzzword-search their database.
It's cheeky, but then again it is a project you can show someone. Graphic designers get to make fancy schmancy resumes to impress their potential employers, so why not programmers? Better make sure there are no bugs in it, though.
I did my entire portfolio as a C++ project at one point. Downside is that recruiters and managers looking at my site thought something was wrong on my server and they were seeing the code for the site.
For bonus points (or not, depending on the recipient's sense of humour) construct a resume that would be suitable for the underhanded C contest (http://www.underhanded-c.org/).
That would be an interesting way for malware to enter: you submit a seemingly innocent resume in C, ask the reader to compile it to see it in pretty colours/formatting, and subtly pwn the machine in the process.
There's an argument that using a 'for' construct groups together the details of the loop (initialization, the loop condition, iteration update), which can aid readability.
(You don't have to search the loop body to find how things change between iterations)
Of course, nothing forces a for() loop to have iteration behavior dependent only on what's in the 'for'. In my experience that seems to be discouraged, however, with 'for' preferred only when there is a predictable and simple iteration pattern.
In the end, it's just a matter of preference and style. :)
This is great fun, which comes along at a nice time since I'm working through an intro to C class to revive some very stale "real language" skills I lost years ago. I've studied through some basic dynamic data structures (linked lists, binary trees, etc.) and was quite surprised at both how much of this I could follow AND how much I learned.
That's pretty great. It's too bad most HR tools would completely screw that nice formatting, but I'd definitely bring printed copies of that into a dev role interview! That'd break the ice right away.
Great idea, but oops... the data structure thing_t is hideous. It is a "god object" in which differently purposed members are union'd together. Also, putting char*'s into a union is a terrible idea.
> It is a "god object" for anything where differently purposed members are union'd. Also, putting char*'s into a union is a terrible idea.
They're not differently purposed, the union is just used to define more contextual labels for equivalent fields, and thus make initialisation more readable within each domain.
There are three data structures unsystematically blended into one (that is, one cannot unambiguously tell which fields go into which "virtual" type, and cannot formally check the correctness). The name thing_t is indicative that you indeed cannot really tell what that entity is. This kind of data structure design begs for errors, while being conceptually wrong.
I get your point, but how do you tell "labelling" from typing? Typing depends on semantics. This particular example only works out because the types are all strings, which is a case of the sort of loose typing often used in programming. It could've been like this:
char* points to some mutable buffer of char's that is supposed to be allocated somewhere. In this particular case they are assigned string literals, but in general it would be quite easy for someone to break consistency of allocations/deallocations in such code.
OK but that's an issue with using char* where does the union issue come in? I could understand an issue with a union between char* and, say, int or some other pointer, but I don't see what specific issue would arise from a union between the exact same pointer types.
The difference is that the code may execute x = "Hello" and then call free(y) (in case the condition is messed up). So the code will try to free() a pointer to a string literal (because x and y share the same memory location in the union).
The point I was making is that char*'s (and other pointer types as well) in unions make it harder to track ownership and lifetime by introducing implicit dependencies between data and increasing complexity with more code paths.
It's much easier to reason about the correctness of the code, when the fields of the data structures are mutated linearly and independently. This is especially important when the code is maintained by a number of people over long time.
Unfortunately, the C language does not have the capability to automatically check the correctness of memory management and object lifetimes, so the developer has to do their best to ensure the correctness of the code. In doing so, some coding practices can be better than others.
I am talking about generally good and bad coding practices here, not about the formal correctness of that particular piece of code. If some code works, then obviously it is correct, even if the code is obfuscated, non-human-readable, or unmaintainable.
It would help greatly to add the appropriate accents (aigus) to the title of this submission, as I misread this to refer to "resume in C" as in "resuming a suspended continuation in C" :)