
Test Driven Development


Comments

  • Registered Users Posts: 9,294 ✭✭✭markpb


    Sparks wrote: »
    That's asinine. You give your professional opinion as to how best to do the job, yes; but if management (or the client) says "fine, now do it this way instead", then you have precisely two options: accept your salary and do the job they want, or quit on principle.

    Nope, you give an opinion about how much the job will cost, not how you'll do it. No-one should come to you asking you how you plan to do it, that's why they hired you - because they assume you know how to do your job. If you find yourself explaining unit testing to a sales person, you're doing it wrong. It's none of their business how software is developed, just that it is. If they come back and say that's too long/expensive, then certainly there's room for compromise but the default position shouldn't be: I'll definitely do the development but I'll only test it properly if they let me.

    No-one asks a builder if he's going to make gas lines safe in a building, they just assume that he'll do it as part of the job. No-one asks a shipbuilder if he'll check that the ship is watertight, they just assume he will. The same should be said about developers. You might assume that it doesn't affect you because you can move on and let someone else deal with the dog's dinner you've left behind, but it does: it affects all of us if developers get a reputation for throwing any old code into production.


  • Subscribers Posts: 4,075 ✭✭✭IRLConor


    If you're tight on time and under pressure from management to not write tests then you should take a broader interpretation of the middle D in TDD. Your development can be driven by testing while writing little or no tests. The object of the game is to write testable code. For every piece of code you write think "How can I test this thoroughly?"

    Once you start thinking along those lines, your code quality will improve. Testable code isn't necessarily good, but code that's hard to test is definitely bad. Think about the things that improve testability (reduced coupling, less global state, fewer singletons, not violating the Law of Demeter, no classes/methods with multiple responsibilities, etc.): aren't they just general good coding practices?
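
    To make that concrete, here's a minimal sketch (plain Java, hypothetical names throughout) of the same logic written twice: first with a global singleton baked in, then with the dependency injected. Only the second version can be unit tested deterministically.

    // Hard to test: the dependency is a global singleton reached from
    // inside the method, so a unit test can't substitute it.
    class GlobalClock {
        private static final GlobalClock INSTANCE = new GlobalClock();
        static GlobalClock getInstance() { return INSTANCE; }
        long now() { return System.currentTimeMillis(); }
    }

    class SessionHardToTest {
        boolean isExpired(long startedAt) {
            return GlobalClock.getInstance().now() - startedAt > 30 * 60 * 1000;
        }
    }

    // Testable: the same logic with the clock injected, so a test can
    // pass a fixed clock and assert both branches deterministically.
    interface Clock { long now(); }

    class Session {
        private final Clock clock;
        Session(Clock clock) { this.clock = clock; }

        boolean isExpired(long startedAt) {
            return clock.now() - startedAt > 30 * 60 * 1000;
        }
    }

    // In a test: new Session(() -> 0L) gives a clock frozen at t=0.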

    If you can spare the time, by all means write the tests for the riskiest parts of your codebase, but there's no point getting into a willy-waving contest with your manager over methodology. If you can prove that bad practices are costing money you may be able to have a reasoned discussion about it, but "preventative medicine" is not something most companies buy into.


  • Registered Users Posts: 40,038 ✭✭✭✭Sparks


    markpb wrote: »
    Nope, you give an opinion about how much the job will cost, not how you'll do it. No-one should come to you asking you how you plan to do it, that's why they hired you - because they assume you know how to do your job. If you find yourself explaining unit testing to a sales person, you're doing it wrong. It's none of their business how software is developed, just that it is. If they come back and say that's too long/expensive, then certainly there's room for compromise but the default position shouldn't be: I'll definitely do the development but I'll only test it properly if they let me.

    This is an argument that falls down when there's more than one person involved in the development. I'm not on the biggest team in the world, but right now I'm working with about a dozen developers on just one smallish component of the overall product, and even at that point you pretty much have to have more invasive management than you're implying, at least on large projects. So an individual developer can't get told "do task X" and tell the boss it'll take two years - one day for task X and the remainder to shove the entire project into a unit testing framework so he can test his code. That's just not how it works; and on large legacy projects, you can't just add unit testing piecemeal, it has to be an actual directed effort. Depending on the size of the legacy project, the cost could be enormous - as in, seven figures enormous without breaking a sweat, and possibly eight if things run over. And when the outcome is not a positive revenue gain but the promise of lower business costs from engineering in the future... well, the argument becomes a bit short and one-sided in places where this quarter's figures drive thinking.


  • Moderators, Sports Moderators, Regional Abroad Moderators Posts: 2,646 Mod ✭✭✭✭TrueDub


    I'm with Sparks on this one - twenty years (:eek:) in IT has shown me that in general, as a professional, you ultimately have to do what you're told. You can provide input, or feedback, to a decision, but if the decision does not tally with what you think, you have a simple choice.

    As regards TDD, it's definitely easier to start it with a greenfield project. You can do it with an existing project, but it usually requires some re-design, and the knock-on effects can be huge. I'd be lost without unit tests now, and tools like Mockito, and it's plain that the use of these tools & techniques reduces bugs; but for an existing project it can be cheaper to simply fix bugs as they arise than to re-engineer the entire system to be testable.
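
    For anyone who hasn't seen it, the kind of test I mean looks roughly like this - a minimal JUnit 4 + Mockito sketch with made-up names, where the collaborator is mocked so the unit under test runs in isolation:

    import static org.junit.Assert.assertEquals;
    import static org.mockito.Mockito.*;

    import org.junit.Test;

    public class PriceCalculatorTest {

        // Hypothetical collaborator the class under test depends on.
        interface TaxRateService { double rateFor(String country); }

        // Hypothetical class under test.
        static class PriceCalculator {
            private final TaxRateService taxRates;
            PriceCalculator(TaxRateService taxRates) { this.taxRates = taxRates; }
            double gross(double net, String country) {
                return net * (1 + taxRates.rateFor(country));
            }
        }

        @Test
        public void addsIrishVatToNetPrice() {
            // Mock the collaborator so no real tax service is needed.
            TaxRateService taxRates = mock(TaxRateService.class);
            when(taxRates.rateFor("IE")).thenReturn(0.23);

            PriceCalculator calc = new PriceCalculator(taxRates);

            assertEquals(123.0, calc.gross(100.0, "IE"), 0.0001);
            verify(taxRates).rateFor("IE"); // and the rate was actually consulted
        }
    }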

    There is a middle ground - make things a little cleaner every time you touch the code. Tidy it up, add a test or two, think about testability when making changes. These little things all add up.


  • Subscribers Posts: 4,075 ✭✭✭IRLConor


    TrueDub wrote: »
    There is a middle ground - make things a little cleaner every time you touch the code. Tidy it up, add a test or two, think about testability when making changes. These little things all add up.

    Yup, and take a leaf from the medical profession's book: primum non nocere - "first, do no harm".

    There are a few circumstances where I'm a little more firm about writing tests:
    • When fixing a bug. You can't honestly fix a bug without reproducing it, and that reproduction is really a new test (see the sketch after this list). You have to write it anyway, so you may as well check it in and let others get the benefit of it.
    • When touching code I don't understand. If I don't understand the inputs and outputs of a piece of code I often write some test code to exercise it as a way of understanding its behaviour at a "black box" level.
    • When touching code that could get me fired if I break it badly enough e.g. billing, new user signup, input validation, etc.
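
    A hedged sketch of the first case, with an invented bug and hypothetical names - the reproduction of the bug report is the test, and checking it in stops the bug from silently coming back:

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    public class SlugifierTest {

        // Hypothetical method under test: turns titles into URL slugs.
        static String slugify(String title) {
            return title.trim().toLowerCase()
                        .replaceAll("[^a-z0-9]+", "-")
                        .replaceAll("(^-|-$)", "");
        }

        // Bug report: "runs of spaces/punctuation produce double hyphens".
        // Writing the reproduction *is* writing this test.
        @Test
        public void collapsesRunsOfSeparatorsIntoOneHyphen() {
            assertEquals("unit-tests-in-legacy-code",
                         slugify("Unit  tests --- in legacy code!"));
        }
    }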


  • Registered Users Posts: 27,161 ✭✭✭✭GreeBo


    Sparks wrote: »
    Lots. Your average bricklayer isn't being held up as the cause of the economic collapse; the 100 or so developers who gave them their orders, on the other hand, are the ones being hauled into court over debacles like Priory Hall.


    There was a similar defence of "just following orders" back in the mid-40s. History didn't think too much of their excuses either.

    You shouldn't be approaching management giving them an option about your code including tests or not including tests. Tests are part of the thing that you deliver, along with comments/documentation/release notes/configuration etc.

    If you don't have good unit test code coverage and good feature coverage (Cucumber etc.) then your initial costs are lower, but maintenance costs and bugs rocket. As soon as someone refactors your code to add some new feature, they have no idea if they have broken the existing functionality.

    If you work for a company where you are just told what to do and you (as the expert) don't control how you write your own code, then you are pretty much acting like a cheap, off-shore commodity resource imo.

    If this doesn't bother you then stay where you are; if it does, and you have a desire to be the best programmer you can be, then I'd advise moving.

    Personally I use BDD: we write acceptance tests first, then the code, and then the unit tests for those individually testable parts of the code.
    If you are working on legacy code then unit tests are a great help before refactoring, depending on the coverage of the tests you can completely

    I fully agree that it's a mind shift. Using BDD really means that you know the exact requirements before you ever start writing your code, helping greatly with design decisions.
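
    As a rough illustration of acceptance-tests-first (in Cucumber the Given/When/Then would live in a .feature file; this compresses the same shape into plain JUnit with invented names), the test states the requirement before the code exists:

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    public class CheckoutAcceptanceTest {

        // Hypothetical domain code - in acceptance-test-first style this
        // is written *after* the scenario below has been agreed.
        static class Basket {
            private double net = 0;
            void add(double price) { net += price; }
            double total(double vatRate) { return net * (1 + vatRate); }
        }

        @Test
        public void vatIsAddedAtCheckout() {
            // Given a basket containing a book costing 10.00 EUR
            Basket basket = new Basket();
            basket.add(10.00);

            // When the customer checks out in Ireland (23% VAT)
            double charged = basket.total(0.23);

            // Then the total charged is 12.30 EUR
            assertEquals(12.30, charged, 0.001);
        }
    }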


  • Registered Users Posts: 9,294 ✭✭✭markpb


    TrueDub wrote: »
    There is a middle ground - make things a little cleaner every time you touch the code. Tidy it up, add a test or two, think about testability when making changes. These little things all add up.

    +1

    This is exactly what developers should be doing. No-one expects anyone to write a whole suite of unit tests for legacy code, but at the very least the code you're affecting should have unit tests added before you start.
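
    For example (a sketch with invented names), a characterisation test that pins down what the legacy code does today, warts and all, before you change it:

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    public class LegacyRoundingCharacterisationTest {

        // Stand-in for the legacy method about to be modified.
        static long toCents(double euros) {
            return (long) (euros * 100); // truncates rather than rounds
        }

        // Record the current behaviour, surprises included, so any
        // behavioural change during the edit shows up immediately.
        @Test
        public void recordsCurrentBehaviourBeforeRefactoring() {
            assertEquals(1000, toCents(10.00));
            assertEquals(28, toCents(0.29)); // surprising, but it's what ships today
        }
    }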


  • Registered Users Posts: 40,038 ✭✭✭✭Sparks


    GreeBo wrote: »
    There was a similar defence of "just following orders" back in the mid-40s.
    I'm pretty sure I've not been ordered to carry out a genocide.
    I mean, I missed a scrum yesterday thanks to a root canal, but I'm pretty sure someone would have mentioned it if killing an entire ethnic group had been added to my backlog...
    If you work for a company where you are just told what to do and you (as the expert) don't control how you write your own code, then you are pretty much acting like a cheap, off-shore commodity resource imo.
    If this doesn't bother you then stay where you are; if it does, and you have a desire to be the best programmer you can be, then I'd advise moving.
    Do you have an argument that isn't based on ego? Because "the desire to be the best programmer I can be" isn't a good business case for a chunk of work that could tie up an entire dev team for a year or two, cost up to seven or eight figures for a large enough legacy project, and not produce a single salable, externally visible feature.
    If you are working on legacy code then unit tests are a great help before refactoring, depending on the coverage of the tests you can completely
    Do you think this thread has descended into an argument against unit tests?
    It has not.
    We're just pointing out that while unit testing makes life a lot easier for some tasks - and with legacy code, damn near everything gets easier with unit tests - it may not be possible to use them or to get management to sign off on adopting them; and when that happens you just have to grit your teeth and get on with the job, instead of sitting about wishing you had them.


  • Registered Users Posts: 27,161 ✭✭✭✭GreeBo


    Sparks wrote: »
    I'm pretty sure I've not been ordered to carry out a genocide.
    I mean, I missed a scrum yesterday thanks to a root canal, but I'm pretty sure someone would have mentioned it if killing an entire ethnic group had been added to my backlog...
    No, but that's just a scale issue. You are basically saying that you will accept being told to do what you know is wrong because someone is ordering you to do it.
    Sparks wrote: »
    Do you have an argument that isn't based on ego? Because "the desire to be the best programmer I can be" isn't a good business case for a chunk of work that could tie up an entire dev team for a year or two, cost up to seven or eight figures for a large enough legacy project, and not produce a single salable, externally visible feature.
    I'm pretty sure I had a number of points in my post that could be construed as an argument...<checks>...yep.
    "Ego" is not part of my argument. If you equate ego with a desire to do your job properly then that's a different issue (which I'm not qualified to help you with). No one is saying put an entire scrum team on writing unit tests. But throwing your hands up and whining "they won't let me" is a cop-out imo.
    (If you are not able to create a use case for writing tests then that is something that I am qualified to help you with)

    I'm guessing by your argument that you and your team don't have any NFRs, since these are not sellable features either, right? I mean no customer is going to know if you use pooled DB connections or if you retry service calls, encrypt sensitive data that's stored in a DB, check for null, catch exceptions, comment your code, etc, etc. But for some reason you (I hope) wouldn't consider allowing management to tell you to write code that didn't include the above basics, yet you have no problem making the maintainability of your code optional. Why you draw this line here makes no sense to me.
    Sparks wrote: »
    Do you think this thread has descended into an argument against unit tests?
    It has not.
    We're just pointing out that while unit testing makes life a lot easier for some tasks - and with legacy code, damn near everything gets easier with unit tests - it may not be possible to use them or to get management to sign off on adopting them; and when that happens you just have to grit your teeth and get on with the job, instead of sitting about wishing you had them.

    and I disagree. It's always possible to use tests of some description. It may not be worth the effort for every part of the code but, as said above, you should strive to make any code you touch better, even if the bit you have been (t)asked to do is for something else. That's part of your job.

    For me, delivering code without inbuilt tests is the equivalent of delivering code with no documentation or no exception handling. Tests are an integral part of the development of software. I think the idea of testing as some optional overhead that you can choose to leave out is frankly archaic, and is what has led to a proliferation of crappy untested legacy code that people are afraid to refactor or even touch, because they don't understand it and are afraid of breaking some unknown element of it.


  • Registered Users Posts: 40,038 ✭✭✭✭Sparks


    GreeBo wrote: »
    No, but that's just a scale issue. You are basically saying that you will accept being told to do what you know is wrong because someone is ordering you to do it.
    I think the day that you can compare methodologies in software development the same way you compare committing genocide with not committing genocide is quite probably the day we should wrap up the entire field and bury it somewhere and just go back to counting on our fingers...
    I'm pretty sure I had a number of points in my post that could be construed as an argument.
    Indeed, but not good ones.
    For example, "do your job properly" sounds easy enough, but then you ran headfirst into the problem of what "properly" means (or, in your original post, what "best" means). You don't have a metric here. Is "best" the same as "shortest development time" or "has the lowest number of bugs" or is it "lowest development cost" or "lowest maintenance costs" (which will be different) or "meets clients' needs most accurately" or "meets clients' stated specifications most fully" (again, not the same thing) or "didn't have to be fired by HR for being unworkable-with", or "could program in a bunch of cool languages we don't actually use"?

    Besides which, "do it this way or you're just not being a good programmer" - that's a fairly extreme statement. Even "goto considered harmful" has caveats and exceptions.
    Why you draw this line here makes no sense to me.
    (a) The phrase "non-disclosure agreement" might be something to keep in mind here; and
    (b) When you work on legacy projects, you're almost never the designer. And you can almost never make major changes in an unplanned manner - and when I say unplanned, I'm talking about plans for migration that tend to be measured in years if not decades.

    There are reasons for the line and its location; they seem obvious to me, they may not to you, and that may be down to a different set of experiences.

    I don't particularly subscribe to the notion that whether or not those reasons seem obvious to someone is a good indicator of professional worth.
    It's always possible to use tests of some description.
    Indeed; however, those tests are not always unit tests, and those tests may or may not be of use for TDD.
    For me, delivering code without inbuilt tests is the equivalent of delivering code with no documentation or no exception handling.
    Perhaps so; I know that for me, putting unit tests and unit test frameworks into some of the code I deliver would be a firing offence for good reason (for example, legal's concerns over library licensing issues; test's concerns over introducing regressions; and so on).
    Tests are an integral part of the development of software. I think the idea of testing as some optional overhead that you can choose to leave out is frankly archaic, and is what has led to a proliferation of crappy untested legacy code that people are afraid to refactor or even touch, because they don't understand it and are afraid of breaking some unknown element of it.
    And I think you are confusing Tests and Unit Tests and seem to have equated one software methodology with good professional conduct; I don't agree with either of those assertions, and the latter is bordering on being impolite.


  • Registered Users Posts: 27,161 ✭✭✭✭GreeBo


    Sparks wrote: »
    I think the day that you can compare methodologies in software development the same way you compare committing genocide with not committing genocide is quite probably the day we should wrap up the entire field and bury it somewhere and just go back to counting on our fingers...
    I think you are (deliberately) missing the point here. I am comparing two occasions of people using "I was just following orders" as an excuse for doing what they know is wrong. Sure there is a vast difference in the fallout of both but the underlying cause is the same.

    Sparks wrote: »
    Indeed, but not good ones.
    For example, "do your job properly" sounds easy enough, but then you ran headfirst into the problem of what "properly" means (or, in your original post, what "best" means). You don't have a metric here. Is "best" the same as "shortest development time" or "has the lowest number of bugs" or is it "lowest development cost" or "lowest maintenance costs" (which will be different) or "meets clients' needs most accurately" or "meets clients' stated specifications most fully" (again, not the same thing) or "didn't have to be fired by HR for being unworkable-with", or "could program in a bunch of cool languages we don't actually use"?
    Without getting into a long argument about definitions, *I* believe that testability and unit tests are fundamental to delivering code.
    I think your metric point is really just a straw-man argument. Why are all your options mutually exclusive?
    Do you have a Definition Of Done?
    Sparks wrote: »
    Besides which, "do it this way or you're just not being a good programmer" - that's a fairly extreme statement. Even "goto considered harmful" has caveats and exceptions.
    I'm not prescribing any "way" of providing testing as part of a deliverable, I'm just saying that it, in my opinion, is a fundamental part of delivering code.
    Sparks wrote: »
    (a) The phrase "non-disclosure agreement" might be something to keep in mind here; and
    (b) When you work on legacy projects, you're almost never the designer. And you can almost never make major changes in an unplanned manner - and when I say unplanned, I'm talking about plans for migration that tend to be measured in years if not decades.
    a) I can't see how any NDA is going to prevent you from writing tests.
    b) I've worked on several products that are multi-million lines of code and have been live for 10+ years.
    Adding Unit Tests doesn't involve making "major unplanned" changes. If it does, perhaps you are not doing it correctly? (Serious question btw)
    If the legacy product is not going to be sunset for several years (typically the case while "they" figure out if they want a rewrite, refactor or something else) then that's all the more reason to add unit tests as you continue maintenance of that product. When/if you get to rewrite it, you know exactly what those parts are supposed to do. That's gold for a rewrite/refactor project.

    Sparks wrote: »
    There are reasons for the line and its location; they seem obvious to me, they may not to you, and that may be down to a different set of experiences.
    Again, I think that's just an "old-school" acceptance of "I'm just a coder, I do what I'm told". I believe development has moved on from this, and the onus is on Snr Developers to push back if you are being told not to write tests.
    Sparks wrote: »
    I don't particularly subscribe to the notion that whether or not those reasons seem obvious to someone is a good indicator of professional worth.
    I'm saying that delivering code without tests, to me, is not fully doing your job. In my experience I have never come across an occasion where it's not possible to improve things by adding some tests.
    Sparks wrote: »
    Indeed; however, those tests are not always unit tests, and those tests may or may not be of use for TDD.
    Agreed, they may be acceptance tests, integration tests or unit tests. But I'm sure you agree that the more levels of tests you have, the more you are covered and the earlier you are going to find issues. I have already stated that I don't "believe" in TDD when it comes to unit tests; however I strongly believe in the existence of unit tests (aiming for 80-85% branch coverage on a greenfield project. With legacy you can only do so much, but you should still be trying).
    Sparks wrote: »
    Perhaps so; I know that for me, putting unit tests and unit test frameworks into some of the code I deliver would be a firing offence for good reason (for example, legal's concerns over library licencing issues; test's concerns over introducing regressions; and so on).

    Maybe I'm missing your point here, but I can't see how you could have licensing issues by writing unit tests that are only ever run during a build. It's not like they go to production etc.? (Again, perhaps I am missing what you are getting at here...)
    I don't follow what you mean by "introducing regressions"? Surely that's the point of automated tests, to find regression bugs without the need for a bunch of manual QA?
    Sparks wrote: »
    And I think you are confusing Tests and Unit Tests and seem to have equated one software methodology with good professional conduct; I don't agree with either of those assertions, and the latter is bordering on being impolite.
    Again, I'm not at all saying "If you don't use TDD you are a bad developer".
    I don't even use TDD (I use BDD).
    What I am saying is that if you are delivering code (written using whatever methodology) and you don't write the corresponding tests (of all useful types/levels), then I don't think you are being as good a developer as you could/should be.
    (I'm not in any way attempting to be impolite or imply anything about anyone here. I just honestly don't think there is any reason not to deliver tests with your code, irrespective of when the tests are written, first or years later)

    It's indeed an interesting discussion, but perhaps we have strayed too far off TDD? (at least for this thread) :o


  • Registered Users Posts: 40,038 ✭✭✭✭Sparks


    GreeBo wrote: »
    I think you are (deliberately) missing the point here. I am comparing two occasions of people using "I was just following orders" as an excuse for doing what they know is wrong. Sure there is a vast difference in the fallout of both but the underlying cause is the same.
    I don't believe that you really can't tell the difference between fulfilling a contract of employment in the software industry and committing genocide.
    I'm starting, however, to think that you can't see that the two don't have the same ethical roots. Which is a bit of a headwrecker.
    I think your metric point is really just a straw-man argument.
    It's not. It's central to your whole "You have to do TDD to be the best programmer you can be" thesis, because you've no definition of what "best" means; you haven't even said from whose vantage point that "best" should be measured.
    I'm not prescribing any "way" of providing testing as part of a deliverable, I'm just saying that it, in my opinion, is a fundamental part of delivering code.
    And I'm saying that that's not fundamental, it's a modern methodology; and modern here does not necessarily imply "better", just "modern". In my project, we have one group of developers and they don't do testing; test is done by five separate groups with different aims and mandates. And you might think your way's better, but you'd have to prove that point, and you haven't even postulated a metric to make that determination yet.
    a) I can't see how any NDA is going to prevent you from writing tests.
    No, but it might affect how specific I can be on how it's done in the places I've worked in; as it should for you as well...
    b) I've worked on several products that are multi-million lines of code and have been live for 10+ years.
    A measuring contest? :D
    We have tools here that beat those numbers, let alone released products.
    Adding Unit Tests doesn't involve making "major unplanned" changes. If it does, perhaps you are not doing it correctly? (Serious question btw)
    Adding Unit Tests may not; adding Unit Testing might.
    If the legacy product is not going to be sunset for several years (typically the case while "they" figure out if they want a rewrite, refactor or something else) then that's all the more reason to add unit tests as you continue maintenance of that product. When/if you get to rewrite it, you know exactly what those parts are supposed to do. That's gold for a rewrite/refactor project.
    You just proposed a project that - for this product - would cost between seven and eight figures, take up at least half the development team (ie. several hundred people) as well as quite a lot of other people for at least a year, and produce no salable features at the end of that. That's a lot for a company to eat. And since there's an existing test framework in place (just not one you'd recognise unless you've been in the industry for a few decades, but whose effectiveness is proven), you need a better argument than "it's better to do it this way" or "we might save money down the line".

    Ideology doesn't last long when it costs eight figures to hold to it.
    Again, I think that's just an "old-school" acceptance of "I'm just a coder, I do what I'm told".
    Try "I'm a professional, I fulfill my contracts". And don't forget that you do have a choice - you can always quit (and it's hardly hardship to do so).
    I believe development has moved on from this, and the onus is on Snr Developers to push back if you are being told not to write tests.
    You can't say things like that without more context, because while that might make sense in a small team in an SME/startup (and I've been in places where it did make sense and was done), it doesn't make sense in every place. You've got a dose of tunnel vision here.
    I'm saying that delivering code without tests, to me, is not fully doing your job. In my experience I have never come across an occasion where it's not possible to improve things by adding some tests.
    I have. I'm living in one (at least with the kind of testing you're talking about). It mightn't happen at the web development level or the small app development level, but with large and very large systems, it does. We don't know how to do very large development projects at a fundamental level as an industry; every study of project success/failure rates and causes shows this. Saying we do and pointing to methodologies developed for and tested on much much much (ie. two to three orders of magnitude) smaller projects as proof is either not bothering to think it through or not understanding the situation.
    Maybe Im missing your point here, but I cant see how you could have licencing issues by writing unit tests that are only ever run during a build.
    You can put damn near anything into a licence agreement. Look at the JSON licence sometime and try to figure out how a company with contracts for any military could comply with its "don't be evil" clause. There are entire legal departments whose job this is, and a software engineer saying they know better is a fairly good example of hubris; I might not like lawyers but I don't think they're faking their workloads...
    I don't follow what you mean by "introducing regressions"? Surely that's the point of automated tests, to find regression bugs without the need for a bunch of manual QA?
    And what if the tests you introduce conflict with existing tests maintained by a different team - something you have no ability to check for, because running those tests takes more hardware than you have access to?
    What I am saying is that if you are delivering code (written using whatever methodology) and you don't write the corresponding tests (of all useful types/levels), then I don't think you are being as good a developer as you could/should be.
    And I'm trying to explain that that's just not true except in the small scale where you have a small development team.


  • Subscribers Posts: 4,075 ✭✭✭IRLConor


    Sparks wrote: »
    And I'm trying to explain that that's just not true except in the small scale where you have a small development team.

    In fairness, your company is an unusually large one. The vast majority of people work in much smaller companies - even the people who work for pretty big companies work for smaller companies than you do. Hell, your company employs more lawyers than the number of people who work with me (based on some estimates I've seen possibly even by an order of magnitude).


  • Registered Users Posts: 27,161 ✭✭✭✭GreeBo


    Sparks wrote: »
    I don't believe that you really can't tell the difference between fulfilling a contract of employment in the software industry and committing genocide.
    I'm starting, however, to think that you can't see that the two don't have the same ethical roots. Which is a bit of a headwrecker.
    Again, you are stuck comparing the results. I'm talking about the problems that are caused by people shirking responsibility for doing the wrong thing just because someone "above" tells them to. You clearly don't want to accept this point so I suggest we drop it.
    Sparks wrote: »
    It's not. It's central to your whole "You have to do TDD to be the best programmer you can be" thesis, because you've no definition of what "best" means; you haven't even said from whose vantage point that "best" should be measured.
    Ah, and you see there's the rub. I *never* said you had to do TDD to be the best programmer you can be. In fact if you read my first post you can see where I say that I don't even do TDD, as I don't believe in it.
    My whole point is that you should deliver some sort of testing with your code.
    Sparks wrote: »
    And I'm saying that that's not fundamental, it's a modern methodology; and modern here does not necessarily imply "better", just "modern". In my project, we have one group of developers and they don't do testing; test is done by five seperate groups with different aims and mandates. And you might think your way's better, but you'd have to prove that point and you haven't even postulated a metric to make that determination yet.
    Again you are arguing against a point I never made. I'm not arguing about when you write unit tests, I'm arguing that they should be written. I thought I'd made this pretty clear tbh.
    Sparks wrote: »
    A measuring contest? :D
    We have tools here that beat those numbers, let alone released products.
    Not at all, I was merely responding to your "when you work on legacy..." point, which implied that I hadn't.
    Sparks wrote: »
    Adding Unit Tests may not; adding Unit Testing might.
    So to test individual logical blocks of the code you write would require major unplanned changes? That's a flashing potential design issue for me.
    Sparks wrote: »
    You just proposed a project that - for this product - would cost between seven and eight figures, take up at least half the development team (ie. several hundred people) as well as quite a lot of other people for at least a year, and produce no salable features at the end of that. That's a lot for a company to eat. And since there's an existing test framework in place (just not one you'd recognise unless you've been in the industry for a few decades, but whose effectiveness is proven), you need a better argument than "it's better to do it this way" or "we might save money down the line".
    I'm not sure when I proposed any project; I'm almost certain I said that when working on legacy systems you should incrementally improve them as you touch bits of them.
    Sparks wrote: »
    Ideology doesn't last long when it costs eight figures to hold to it.
    Ignoring testing doesn't last long when it costs eight figures to maintain the code and deliver nothing. Without specific examples we can throw these weighty statements around all day and they mean nothing.
    Sparks wrote: »
    Try "I'm a professional, I fulfill my contracts". And don't forget that you do have a choice - you can always quit (and it's hardly hardship to do so).
    Is this directed at me? You haven't provided a definition of "fulfill". To me, fulfilling my contract is delivering code with associated levels of test.
    If I was in an environment where I was threatened with dismissal for wasting time writing tests, then I wouldn't last there too long.
    Sparks wrote: »
    You can't say things like that without more context, because while that might makes sense in a small team in an SME/startup (and I've been in places where it did make sense and was done); it doesn't make sense in every place. You've got a dose of tunnel vision here.
    You seem to be pigeonholing me as working in SMEs without any knowledge of what I've worked on. *I'm* not trying to turn this into a pissing contest but...

    How can it ever make sense for Snr Devs to not push back on being told not to write tests for their code?
    Sparks wrote: »
    I have. I'm living in one (at least with the kind of testing you're talking about). It mightn't happen at the web development level or the small app development level, but with large and very large systems, it does. We don't know how to do very large development projects at a fundamental level as an industry; every study of project success/failure rates and causes shows this. Saying we do and pointing to methodologies developed for and tested on much much much (ie. two to three orders of magnitude) smaller projects as proof is either not bothering to think it through or not understanding the situation.
    Again, I'm not pointing at any methodology. I think it's been pretty well shown that the sooner you find bugs the cheaper they are, and that running automated regression/unit/acceptance tests as part of a build is the quickest/cheapest way to find them.
    The larger a project is, and the more integration points it has, the more important it is to be able to quickly identify what has broken and what has caused it.
    Sparks wrote: »
    You can put damn near anything into a licence agreement. Look at the JSON licence sometime and try to figure out how a company with contracts for any military could comply with it's "don't be evil" clause. There are entire legal departments whose job this is, and a software engineer saying they know better is a fairly good example of hubris; I might not like lawyers but I don't think they're faking their workloads...
    I'm not sure where I said lawyers were faking their workload...
    Anyway, I can't see why you'd be using JSON (or any 3rd party library) in your tests and not in your production code.
    Sparks wrote: »
    And what if the tests you introduce conflict with existing tests maintained by a different team - something you have no ability to check for, because running those tests takes more hardware than you have access to?
    Define conflict? If your tests are breaking something, then you need to learn how to write better tests. If your test results disagree with someone else's results, then someone doesn't understand the requirements.
    If hardware is the issue, you are probably talking about either performance testing or integration testing. Hardware should be pretty irrelevant for unit tests. You are trying to test the logical parts of your code.
    You don't need to bring down an actual data centre and fail over to another one to test the individual steps of the process.
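
    To illustrate (invented names, plain JUnit): the decision step of a failover can be exercised with a fake health check, no hardware involved.

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    public class FailoverPlannerTest {

        // Hypothetical seam: how the planner learns whether a site is up.
        interface HealthCheck { boolean isUp(String site); }

        // Hypothetical unit under test: just the routing decision.
        static class FailoverPlanner {
            private final HealthCheck health;
            FailoverPlanner(HealthCheck health) { this.health = health; }

            String route(String primary, String secondary) {
                return health.isUp(primary) ? primary : secondary;
            }
        }

        @Test
        public void routesToSecondaryWhenPrimaryIsDown() {
            // Fake health check: "dublin" is down; no data centre harmed.
            FailoverPlanner planner = new FailoverPlanner(site -> !site.equals("dublin"));
            assertEquals("frankfurt", planner.route("dublin", "frankfurt"));
        }
    }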
    Sparks wrote: »
    And I'm trying to explain that that's just not true except in the small scale where you have a small development team.

    First define "small".
    You work for IBM. Great. That doesn't make anyone smaller than IBM automatically "small" and suddenly invalidate my arguments, which apply to the majority of companies. Also, I'm almost certain the half a million employees in IBM are not all working on developing integrated (in an EIP sense) software projects. If that argument worked, I could mention the holding company I work for and "beat" you. We are talking about development projects, not the size of companies.

    Then explain to me in simple English (because I'm clearly missing something) why you can't write code to test other code you have written because of the size of your development team; honestly, it just sounds like an excuse to me.


  • Registered Users Posts: 40,038 ✭✭✭✭Sparks


    IRLConor wrote: »
    In fairness, your company is an unusually large one. The vast majority of people work in much smaller companies - even the people who work for pretty big companies work for smaller companies than you do. Hell, your company employs more lawyers than the number of people who work with me (based on some estimates I've seen possibly even by an order of magnitude).
    ...that's a pretty fair point alright!


  • Registered Users Posts: 40,038 ✭✭✭✭Sparks


    GreeBo wrote: »
    Again, you are stuck comparing the results. I'm talking about the problems that are caused by people shirking responsibility for doing the wrong thing just because someone "above" tells them to. You clearly don't want to accept this point so I suggest we drop it.
    I don't want to accept it because you're wrong about it being the wrong thing and about abdicating responsibility.
    (1) You've not proven it's the wrong thing, you just think it is; and
    (2) You don't abdicate responsibility; you either do the job or quit - the point is that this third option you're pushing doesn't really exist.
    So to test individual logical blocks of the code you write would require major unplanned changes? That's a flashing potential design issue for me.
    And now we're in NDA territory.
    I can say that you're incorrect or correct depending on the size of the logical block you're considering, but I don't think I can go much further than that.
    You haven't provided a definition of "fulfill". To me, fulfilling my contract is delivering code with associated levels of test.
    To me, it means complying with the actual written job contract that I signed.
    If I was in an environment where I was threatened with dismissal for wasting time writing tests, then I wouldn't last there too long.
    Risking nine or ten figures worth of revenue in a licensing lawsuit because you used a library or framework that wasn't cleared by legal isn't seen as being reasonable here; and given the litigious nature of the industry these days, from patent trolls to Samsung and Apple, it's hard to completely dismiss their concerns.
    How can it ever make sense for Snr Devs to not push back on being told not to write tests for their code?
    When there are five other groups with more people in them writing tests for the code, to give my own example.
    I think it's been pretty well shown that the sooner you find bugs the cheaper they are, and that running automated regression/unit/acceptance tests as part of a build is the quickest/cheapest way to find them.
    Not at this scale it hasn't been (the latter I mean, not the former). It's been tried and it's been shown not to be the fastest way to work with a codebase this large. Experimentally. In anger.
    Anyway, I can't see why you'd be using JSON (or any 3rd party library) in your tests and not in your production code.
    The point wasn't JSON; it was JSON's licence clause as an example of how licences can have weird clauses. Nothing says you can't have a licence that requires payment for use even if the code's not deployed in production.
    Define conflict?
    NDA again, but if I bolt on code for testing that breaks legacy testing (which is possible), then the problem isn't someone else's.
    If hardware is the issue, you are probably talking about either performance testing or integration testing. Hardware should be pretty irrelevant for unit tests. You are trying to test the logical parts of your code.
    You don't need to bring down an actual data centre and fail over to another one to test the individual steps of the process.
    You do for some of our tests.
    And that happens to be the area I'm working in, so I know that some of the tests just can't be done by me.
    First define "small".
    I don't have a solid line, but let's say that large for me equates to several tens of MLOC, several decades' worth of work, hundreds of developers in multiple sites, outnumbered a few times over by testers; and small would be where you can know everyone's name on the product team and hold the codebase's general block diagram in your head with less than a year's work.
    Then explain to me in simple English (because I'm clearly missing something) why you can't write code to test other code you have written because of the size of your development team; honestly, it just sounds like an excuse to me.
    In simple English: because it's not my job. I'm paid to do other things; other people are paid to write tests. At the risk of monumental oversimplification in order to avoid NDA violation: there's a spec; I write to it, they test to it.

    If I don't want to do it that way, I can leave - transfer to another product team where they do it a different way, or just quit altogether. If I don't want to leave, I do the job the way they want it done. There is no third option where I dictate to men and women with thirty years of experience in the product how to do their job and restructure over a thousand devs and testers and managers spread all over the world, and it would be deeply unprofessional not to mention deeply arrogant to try to do so.


  • Registered Users Posts: 27,161 ✭✭✭✭GreeBo


    Sparks wrote: »
    To me, it means complying with the actual written job contract that I signed.
    So, simply, if they don't ask for it, you don't deliver it. Does that extend to documentation, naming conventions, error/exception handling and all the other NFRs that are typically associated with software development?

    To me that's acting like a cheap off-shore contractor that you have to specify pathetic details to, otherwise you get junk back.

    In my mind you are using your current situation (working for one of, if not the, biggest software houses in the world) to colour your view. Using your definition of "large", how many large companies are there? What percentage of companies do you think falls into "large"?
    I think your "tunnel vision" jibe was spot on, you just aimed it incorrectly.

    I think delivering tests is a fundamental part of writing code and something every developer should strive for (irrespective of methodology); you don't.

    I think we will have to agree to disagree on this one tbh.
    Enjoyed the conversation, thx but I'm out.


  • Moderators, Sports Moderators, Regional Abroad Moderators Posts: 2,646 Mod ✭✭✭✭TrueDub


    I've worked in similar situations to both Sparks and GreeBo, so here's my take on the situation:

    Situations similar to Sparks: you do what you're asked. This is driven by project setup (everyone has a specific task, don't cut across each other), managerial fiat and convention - if that's how it's done here, it's how we do it. I had 4 serious jobs (18 months plus) and the first 3 were like this. We coded, the testers tested and we all moved on. It worked (these were BIG companies who are all still around), and the job got done.

    Situations similar to GreeBo: you've more freedom to do things the way you want. My current job is very similar to GreeBo's, and we do things the way we feel achieves the job best. This is BDD, with lots of automated tests, and it pays off. We do this because we're able to, though, and because we've proved that it works for the company. Other teams & projects are not so lucky and do not get to do things this way.

    My ultimate point: people will do the job as best they can given the circumstances of the job. If the culture says do this, you do it. Even in my current role, where we've a lot of freedom, there's lots I'd do differently, but the restrictions are there and we get on with it.

    I can't say one way is "right" and one "wrong", because from job to job, project to project, the "right" way varies. Part of being professional is trying to improve what you can, by example or by suggestion, and another part is accepting that certain restrictions won't change anytime soon and finding a way around them.

    My 2 cents: I'd be very reluctant to go back to an approach where there's no BDD and no automated unit test suites. However, if it was the difference between eating and not eating, I'd probably even go back to COBOL. :mad:

