Hacker News

Building a simple debugging tool to monitor your program for possible inconsistently reproducible bugs and having unit tests are not mutually exclusive concepts. You're getting downvoted because you're making condescending assumptions about the developer.



I'm not being intentionally condescending; it's a valid question to ask. I come from a strongly TDD background, and I can't tell you how many times I've seen people write simple debugging tools only to have the tool be the source of the bug. Ever written a println and output the wrong variable, then scratched your head over why the results were wrong?

Websockets themselves are well debugged, just like HTTP. Years ago I stopped spinning up an HTTP server just to test my endpoints; now I just call the functions directly. So the issue would be in the OP's code that does the sending/receiving. Take websockets out of the equation and everything would work fine with tests.
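To illustrate the "call the functions directly" idea (with a hypothetical handler and message shape, not OP's actual code): if the message-handling logic is a plain function, it can be exercised without any server or socket in the loop.

```python
import json

def handle_message(raw: str) -> str:
    """Hypothetical handler that would normally sit behind a websocket."""
    msg = json.loads(raw)
    return json.dumps({"echo": msg["text"].upper()})

# Call the function directly -- no server, no websocket connection.
result = json.loads(handle_message('{"text": "hi"}'))
assert result == {"echo": "HI"}
```

The transport layer only ferries bytes; the logic worth testing lives in the handler.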


> Come from a strongly TDD background

Friend, we need to be damned careful about how we view others' code when they don't have tests or embrace TDD.

When we work with TDD or even a well tested codebase, we do benefit enormously.

We also risk becoming evangelical, and that almost always comes across as condescending and off-putting.

TDD tests do not capture every possible point of failure. There are clear use cases for things like Postman, debugging proxies, or OP's _debugging_ tool.

PSA - avoid being a hype monger or evangelist of anything, it's about the worst way to support practices or tools you believe in. Trust me, I'm not only a practitioner of TDD but also a Lisp advocate and Emacs user... All these things are the best ever thing that happened /s

No, the whole world won't ever be a place where everyone is "doing it right" by your estimation, or mine, or anyone else's.

Quite the opposite, so settle in and mellow your jets.


Good response.

I'm not being evangelical by asking a question.

I'm also not trying to start a sermon. My response was to explain where I was coming from, because I was being called condescending.

I don't care about "doing it right". I can ask why someone would write a tool instead of using one of the existing ones... or writing tests.

My jets are plenty mellow.


>no wonder we end up with such buggy junk software... people... write tests!

absolutely condescending and holier than thou. I view TDD as a premature optimization when writing software that isn't simplistic or cut-and-dry CRUD. Trying to do something new doesn't always mean there's a known output before writing the actual code. Experimentation is a valid thing in software development and writing tests before writing the code would stifle some types of development. In a lot of cases it's bass-ackwards to write tests before the implementation is producing a useful result. I know this type of development might seem foreign to some, but you don't always know what you don't know before you start experimenting while writing code, and TDD doesn't really fit all types of development.


> absolutely condescending and holier than thou.

That was an update after I was downvoted.

You're right, there are reasons to not always do TDD. Writing a tool to inspect websockets isn't one of them.


>Writing a tool to inspect websockets isn't one of them.

That's assuming quite a bit and not at all true in all cases. I deal with websockets with wireless IoT devices and I can tell you that your insistence that debugging websockets doesn't require writing any custom tools is just wrong.

So great, you wrote a test that passes. Your test runs it once. Maybe your test runs it twice. But what happens when the IoT device fails in an unexpected way after the 150th run? Your test isn't going to catch that, and if you run the test repeatedly enough to catch it, it's never going to finish in a reasonable amount of time. Do you think running the tests for an entire day is reasonable? No. That's why sometimes building a "debugging CLI" like OP wrote is needed.

Not everything can always be debugged with a simple test, or any kind of test. Network congestion effects can be tricky to work out. Memory leaks in remote devices you don't control. There's a whole lot of reasons why tests aren't always the catch-all some people think they are. I'm not saying don't write tests, just don't go around suggesting they're a cure-all.


It isn't easy, but all of this can (and should) be tested. It will produce far better and more reliable code. You can certainly run tests 151 times. In fact, if you think you're going to find a bug after that many runs, a test suite is perfect for automating the execution.

Don't believe me? Jepsen is a great example of difficult distributed testing done well: https://jepsen.io/
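As a minimal sketch of what "automating the 151st run" could look like (the failing device here is simulated with a counter, purely for illustration; a real setup would call actual device code):

```python
# Simulate a device that fails on its 150th invocation.
calls = {"n": 0}

def ping_device() -> bool:
    calls["n"] += 1
    return calls["n"] != 150  # hypothetical failure mode

def test_repeated_pings():
    # A test can trivially drive hundreds of iterations
    # and report exactly which one failed.
    for i in range(1, 152):
        assert ping_device(), f"device failed on run {i}"

try:
    test_repeated_pings()
except AssertionError as e:
    print(e)  # -> device failed on run 150
```

The point is that the loop, not a human at a CLI, does the repetition, and the failure message pins down the exact iteration.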


You sound like you have TDD Stockholm Syndrome and can't fathom any other kind of software development.


I wouldn't describe TDD as captive or abusive.


It's definitely captive depending on who leads the team. I've done it before. And it can be abusive if the CTO demands 100% code coverage. It gets to the point where writing any test, even a bad one, is better than delaying a release just for the sake of "100% coverage". And that was at an early-stage startup where we had to move fast - and guess what, they went out of business in 6 months because we couldn't get any kind of product out. It was hilariously stupid to demand 100% code coverage at a startup without any product. TDD and testing isn't a cure-all, and in some cases it's a curse.


Your CTO was captive/abusive, not TDD. You're describing your own SS and trying to equate that with "all TDD is bad". I said nothing about 100% test coverage or poor management skills.


In my case, TDD was absolutely the problem with the dev team failing to deliver in a timely way - it wouldn't have mattered if the CTO mandated 100% test coverage or 75%. Anyone who thinks TDD doesn't slow a team down to a crawl is just fooling themselves, hence the Stockholm syndrome. In my view, TDD is bad for most projects, but tests are not. TDD is putting the cart before the horse: tests should be written once the functionality is working well. There are very few kinds of development where TDD is the right way to go, and they involve launching space shuttles, designing medical equipment, and operating nuclear power plants. Launching a social media site? No, just no.


>Writing a tool to inspect websockets isn't one of them.

What kind of developer makes a blanket statement like that? There are all kinds of situations where debugging is more like detective work. When you discover the problem, then you have the knowledge necessary to actually write the test that fails and which you can then make succeed.


> What kind of developer makes a blanket statement like that?

One with over 20 years of experience building software used by millions of people daily.

You're trying to generalize a very specific conversation.

OP was talking about building a tool to debug websockets. You don't need to debug websockets, you need to debug your code that is sending and receiving data _over_ websockets. In that case, you don't even need websockets at all. Meaning you don't need a tool to inspect the data either. Just make sure that your code is sending and receiving the correct data and you're done.

Of course, if you want to build a tool, go ahead... that's your choice, but my question is still valid. Why not write tests?


If people don't have the same positive relationship with tests that you do, I could easily see "writing tests" generating enough negative sentiment to seriously degrade a person's capacity to engage the near-whimsical part of the brain that allows one to think laterally at a non-trivial 'distance' from the obvious problem. It could easily be as simple as writing tools being more inspiring / engaging to the OP than writing tests.


I'll chime in here because it looks like you've had a terrible time with someone who's a [insert word of choice].

TDD "evangelism" is often toxic and abusive. The actual practice of a functioning TEAM (not just the devs) who know what it is and how it works, changes things fundamentally for the better.

Spikes are used to do the "whimsical" how-the-F-do-we-do-Y parts. XP devs don't TDD those, and the code is usually tossed out (with the good bits used to help build the production code).

Similarly, TDD isn't a religion, and only a fool would apply it like one. The good parts: you get a test suite, you get developer examples of how to use a unit, and you get an extra step to think about naming, so you're _slightly_ less likely to be saddled with a poorly thought-out API / naming scheme.

It's broadly useful and helps teams integrate code much faster.

Everyone needs to be playing the game, though, and they need to know the rules.

C2 wiki is about the best place to find organic conversations about XP and TDD (archived though, so no sanctimonious claptrap from the likes of me), and the wide internet is a great place to get polarized.

I'm sorry you got a bad taste, and it may have put you off, but a functioning TDD team is about the sanest, quickest dev team you'll have the pleasure of working with. Whereas, as you've found, an oppressive diktat to DO TDD PROPERLY when the team isn't really doing that is hellish.


You're 100% right and that's a large part of the problem here.



