Location: Chattanooga, TN
Remote: Yes
Willing to relocate: Yes, with relocation assistance.
Technologies: React, Python, Django, Node, Typescript, Postgres, C#
LinkedIn: https://www.linkedin.com/in/charles-taylor-349609133/
GitHub: https://github.com/CharlesTaylor7
Email: My first and last name can be seen from Github and LinkedIn. As a python formatted string: f"{first_name}{last_name}95@gmail.com"
Resume: please email me if you'd like a PDF version of my resume. It's basically the same as my LinkedIn profile.
I'm a full stack software engineer who is effective in fast-paced startup environments. I appreciate a challenge and don't shy away from one.
I'm a big fan of statically typed languages and automated testing (both unit and e2e).
I have professional experience in the technologies listed above, but I am also keen to begin working professionally with Rust. I generally default to building my side projects in Rust these days.
Yes, that is what I meant. You may as well include the prefetch. And if the browser (or the user) doesn't want the prefetch, they just get a slower load. If the user enables it, they get the snappier experience.
Empty string and None are not the same, nor in any way redundant.
Linters belong in CI. Humans belong in code review.
I can actually explain to a human why my text field has a default of None.
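To make the distinction concrete, here's a plain-Python sketch (the function name and strings are made up for illustration, not from any particular codebase):

```python
# "" and None are distinguishable values, so they can carry
# different meanings: None = "no data", "" = "explicitly empty".

def display_middle_name(middle_name):
    """Hypothetical helper: None means the field was never collected,
    while "" means the user deliberately left it blank."""
    if middle_name is None:
        return "(not provided)"
    if middle_name == "":
        return "(intentionally blank)"
    return middle_name

print(display_middle_name(None))   # (not provided)
print(display_middle_name(""))     # (intentionally blank)
print(display_middle_name("Lee"))  # Lee
```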
It's a code-smell in Django, but I agree that this kind of thing shouldn't be "auto-fixed" by a bot. There are perfectly valid reasons why you might want to eschew the convention.
> Avoid using null on string-based fields such as CharField and TextField. If a string-based field has null=True, that means it has two possible values for “no data”: NULL, and the empty string. In most cases, it’s redundant to have two possible values for “no data;” the Django convention is to use the empty string, not NULL. One exception is when a CharField has both unique=True and blank=True set. In this situation, null=True is required to avoid unique constraint violations when saving multiple objects with blank values.
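The exception in the last sentence comes down to how SQL unique indexes treat NULL. A stdlib-only sketch using sqlite3 (standing in for whatever database backs the app) shows why `null=True` is required there:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE profile (referral_code TEXT UNIQUE)")

# Multiple NULLs are fine: unique indexes ignore NULL.
conn.execute("INSERT INTO profile VALUES (NULL)")
conn.execute("INSERT INTO profile VALUES (NULL)")

# Multiple empty strings collide: '' is a real value.
conn.execute("INSERT INTO profile VALUES ('')")
try:
    conn.execute("INSERT INTO profile VALUES ('')")
    collided = False
except sqlite3.IntegrityError:
    collided = True

print(collided)  # True
```

So with `unique=True, blank=True` and no `null=True`, the second saved blank object would violate the constraint.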
Perhaps I need to clarify "auto fix" - the dev still needs to click "commit" on the suggestion. Given a PR with 3 Django Doctor suggestions, 2 could be ignored and only 1 committed by the dev if they choose.
Unfortunately it's tricky to add new fields in a zero-downtime fashion without making them nullable. Django does not store defaults in the DB (this is surprising, and the opposite of the Rails approach), and this means that if you upgrade your DB first, until you've upgraded your application all writes to your model are going to fail because the (down-level) application code doesn't know it has to specify the (up-level) DB field.
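A sketch of that failure mode, again with stdlib sqlite3 standing in for the real database (table and column names are invented for illustration):

```python
import sqlite3

# Step 1 of a zero-downtime rollout: add the new column as NULLable,
# so down-level application code that doesn't know about it keeps working.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE account (name TEXT NOT NULL)")
conn.execute("ALTER TABLE account ADD COLUMN plan TEXT")  # nullable

# Down-level writer: only knows about "name" -- still succeeds.
conn.execute("INSERT INTO account (name) VALUES ('alice')")

# By contrast, if the up-level schema makes the column NOT NULL with no
# DB-side default, the same down-level write fails:
conn2 = sqlite3.connect(":memory:")
conn2.execute("CREATE TABLE account (name TEXT NOT NULL, plan TEXT NOT NULL)")
try:
    conn2.execute("INSERT INTO account (name) VALUES ('bob')")
    failed = False
except sqlite3.IntegrityError:
    failed = True
print(failed)  # True
```

The usual workaround is: add nullable, deploy the new application code, backfill, then tighten the constraint in a later migration.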
From chatting to folks, most applications just YOLO this, and accept downtime on every such DB operation; you might not even notice that this is happening at first because you'd need to have a create _during_ your DB migration. It'll bite you sooner or later though. I'm sure for many apps a client-side retry is totally acceptable, assuming it's safe to do so.
Sure, but if it's outside the usual convention it should be explained with a comment. The comment can also include a command to tell the linter to ignore it.
Bear in mind the comments are "food for thought" and aren't aimed at blocking merge. In the review you can pick which suggestions you want and ignore the stuff that does not add value to you.
Soon we will be adding a config file so you can mute the things that are vexatious.
These languages aren't even trying to capture the same market niches.
Swift is literally vendor-locked. How the heck is it next to open-source, runs-anywhere languages?
There is a distinction between necessary complexity and "busy beaver" artificial complexity. And it turns out that there's a pretty compelling mapping between energy and information and fundamental complexity. There's a reason they both use a concept of 'entropy'.
That said, there's no denying that there's probably a lot of unnecessary complexity in most software.
I'm sorry, but without some citations this is still handwavy bollocks.
Energy has a conservation law. Entropy does not. Nor does information, complexity, etc.
I majored in physics; I'm familiar with these terms.
The grandparent post made no indication they were talking about essential complexity when they made their bold assertion about how it's conserved like energy.
I'm also not aware of any formal or precise way of measuring "essential complexity". Not in the way we do for energy or entropy.
If only there were some scientific discipline that studied complexity and proved that problems possess some essential, minimal, complexity below which no implementation can go...
If anyone proved such a result, I'm sure they would be awarded a prize of some kind.
Well, there are many complexities. People here are talking about the Programming Complexity (of which one measure is Cyclomatic Complexity) of a textual program vs. the Time/Space complexity of the same program. There is the weakly-related concept of the Kolmogorov complexity of a string.
It's obvious that the time/space complexity of a program can stay fixed while you arbitrarily raise the programming complexity of a program by arbitrarily raising the number of branches that are rarely taken, an action that won't affect the asymptotic time complexity of a thing.
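A toy illustration of that point (invented functions; both are O(n), but the second has far higher cyclomatic complexity because of branches that are almost never taken):

```python
def total_simple(xs):
    # One path: O(n) time, minimal programming complexity.
    return sum(xs)

def total_branchy(xs):
    # Same O(n) time, but the rarely-taken branches inflate the
    # cyclomatic complexity without changing the asymptotics.
    total = 0
    for x in xs:
        if x == 42:          # rarely taken
            total += x
        elif x < 0:          # rarely taken
            total += x
        elif x % 1000 == 7:  # rarely taken
            total += x
        else:
            total += x       # the branch that actually runs
    return total

print(total_simple([1, 2, 3]) == total_branchy([1, 2, 3]))  # True
```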
And sure, you can talk about the Kolmogorov complexity of a string of computer code, but the minimal string representation of that code is unlikely to be one that a programmer would describe as simple. Even minimizing the string that would behave as that of the original program is usually non-desirable.
Pick any of those. Given a problem, there is a minimal complexity to the programs that solve it. You can always raise the complexity, but not reduce it.
I'm confused by your response to ninjapenguin[0]. Were you agreeing with him or disagreeing or something else? I interpreted your response to be disagreement. I feel that your original style of phrasing "If only x etc. etc." is not easy to understand. I'm having a hard time placing your subsequent comments in context of that original response.
I was disagreeing with the statement that "Comparing [complexity] to something as fundamental as energy is pure bollocks." There is, in fact, a discipline that studies complexity -- computer science, and, in particular the field of complexity theory. Complexity was, in fact, found to be fundamental and "irreducible". Two people, Hartmanis and Stearns, did, in fact, discover that through a comparison to physics [1] in 1965, and for that discovery -- that led to the creation of complexity theory -- they won the Turing Award in 1993.
Not really. You can describe the complexity of "code understanding" as the computational complexity required to answer a question about it, say the time it takes to find it or the length of the proof -- both are common computational complexity measures. The proof complexity of determining that bubble-sort sorts is, indeed, shorter than that of Timsort. You could even talk about the simplest sorting algorithm as the one with the shortest proof. There is, indeed, a minimal proof complexity for sorting, just as with any other kind of computational complexity.
No, they're completely different uses of the term.
A lot of real-world CRUD code, for instance, does nothing of computational interest whatsoever, but it isn't simple, because it has to cope with real-world business-logic complexity. (And possibly a lot of complexity beyond the necessary complexity, due to poor design, changing goals slowly messing up the code, etc.)
> You can describe the complexity of "code understanding" as the computational complexity required to answer a question about it, say the time it takes to find it or the length of the proof
I don't follow. How are proofs relevant? A programmer having to wade through poorly-named functions and a lack of documentation, is not well modelled by computational complexity theory.
> You could even talk about the simplest sorting algorithm as the one with the shortest proof.
Who'd care? That bears little relation to either the complexity theoretic properties of the algorithm/problem, or to complexity in the informal sense.
They are one way of measuring how hard or easy it is to know something, and I define simple as easy to answer questions about, and complex as the opposite.
Introduce enterprise into any system and complexity goes through the roof ASAP - suddenly it's not only your machine but a TEAM of people, history, dashboards with metrics, logs, automation, security, etc...
This one isn't even approaching any of it except the dev side and CI.
I think the analogy to energy is pretty good. Like the FizzBuzz thing, you can also build a machine that has a lot of energy but doesn't accomplish anything useful.
The conservation of energy, specifically, is a misleading analogy. If you want an energy-related analogy, I think you will find a better one in entropy (especially given its close relationship to information.)
I've had the opportunity to do this format twice and I think it works well.
Here's how it works:
They ask you to pick a 3 hour time window to work on the project.
You pick the time.
They send you the project prompt in an email right at the beginning of your time interval.
You have 3 hours to read the prompt, design, implement, test a solution.
If you miss the deadline, project counts for nothing.
Both the times I did this the prompts were absolutely doable in 3 hours.
They were the kind of thing you could do in about 2 hours if you had the ability to read the project prompt ahead of time and think about it for a few days.
Personally, I work well with time constraints. This format feels closer to a workday where mid-afternoon you might challenge yourself to finish a task before 5pm. It's a lot less pressure than having to juggle a conversation while having your code scrutinized in real time.
It forces a real time constraint that keeps you from piling time into something you just don't know.
It's hard to cheat, and it also forces the company to make the task achievable in the time frame. This format forces them to realize when the task is too big for anyone to finish in the time.
I hate time constraints. That is why I hate whiteboarding as well.
My brain just doesn't work that way. If I have to modify something in a project I already worked on, that's fine. But interviews always ask you to build something from the ground up.
Once I get the requirements, I need a few hours to think about them before I write any actual code. I usually do something else while I do this. On a job, it's when I go home and do some chores and my mind can wander around freely. But I can't make this process much faster. If I'm on the clock I have to sit down and start spitting out code. Also, there is no time for any major refactoring if you went in the wrong direction. This adds a lot of unnecessary stress.
I like it when you have a day or an afternoon to finish a 3-hour task. I work faster if there's no time pressure.
The real world has time constraints, so I don't see this as an unreasonable expectation in an interview.
> I need a few hours to think...before I write any actual code...I work faster if there's no time pressure.
It sounds like you work better but not faster. If something absolutely must be done in 3 hours, you usually will not have 2 hours to do the dishes while you let it marinate.
To be clear, I prefer to let a problem sit in my head for a bit, too. I think the end result is better in most cases, for most people. But the real world has problems with hard and fast time constraints, or huge financial incentives to get something working ASAP and clean it up later if necessary.
But how many of those involve building a brand new thing you've never thought about from scratch? As the person you responded to said, if they're modifying something they've already worked on, that's different.
If something /brand new/ needs to be built in HOURS with huge financial incentives, that is a failure on so many levels of business that blaming the engineer for their inability to actually get it out the door in 3 hours is ridiculous
Yes, I'm completely fine with actual hard time pressure when I know the codebase. When I'm on call and something goes wrong I can fix the code very fast and reliably.
When I'm presented with a completely new problem I have to think about what the best tool for the job is, how I want to structure my code, how I can test it, what the edge cases are, etc.
Yes. Though in practice I'd give people some grace period after the three hours: the three hour time limit is to help the candidate not commit endless hours to the project, it's not meant as a punishment.
I agree here - given the pressures of an interview are much higher than on the job, enforcing a time limit should only prevent extremes rather than add yet more pressure.
I think I’d be fine if someone gave me 3 hours to complete a typical take home exercise, but I worry that some exceptional engineers I know might not be. Applying excessive time pressure could lead to cultural homogeneity.
Yup I've done a couple interviews like that. They send the email at the time of your choosing and you have to submit the finished challenge at whatever deadline. Works for me! I can do things how I want to, just as if I had been given such a task on the job.
> They send you the project prompt in an email right at the beginning of your time interval
Whoops, your home internet has gone down. Or Gmail has decided it's not having any mail from their domain. Or your mail provider thinks they're spam. Or your mail client doesn't pick up the email for half an hour. There's a whole raft of problems that come from assuming "email delivery is reliable enough to be a timer."