A Paradigm shift???

  • 26-12-2011 4:05pm
    #1
    Moderators, Business & Finance Moderators Posts: 10,258 Mod ✭✭✭✭


    This thread was prompted by comments elsewhere about .Net and while I don't want to start yet another my-language-is-better-than-yours thread, I do think there are a few points worth considering...

    When I started out, I remember having a conversation with an "Old" programmer, who was not impressed with my new Windows 3.0 box... his comment was something like "look we only do two things in IT - centralize or decentralize things". Over the years I've marveled at how accurate he was - the technology changes, but the underlying strategy is always one of the two!

    Right now I see several interesting things happening in our industry:

    First of all, "Bring Your Own Device" (BYOD) is slowly starting to hit the workplace - if you have not heard of the concept, it goes like this: your employer gives you a sum of money and lets you buy whatever device or devices you want to use at work (from a preferred list, of course). The employer benefits on two levels: first, he can expense the payment to you rather than capitalize it, and second, he can reduce his help desk costs, since he is no longer responsible for the device - it's between you and the vendor.

    The vendors are becoming consumer rather than corporate oriented. One consequence of this is that the proposed software distribution model is aimed at users buying applications, rather than corporations trying to distribute in-house applications. If you spend any amount of time digging through Apple's App Store, you will eventually find a collection of applications that are front-ends for various vendor applications - they usually contain a warning that this is the front-end for the XYZ Server and that if you don't have a license for it, the app will not work, bla, bla, bla. Clearly such a model is not going to be suitable for corporates going forward, so some kind of alternative is required...

    "Steve Jobs (RIP) said so" - all the vendors are adopting HTML 5.0 at a rapid rate. Apple suggests that the best way for corporates to write iPhone/iPad apps is to use HTML 5.0, while at the same time Adobe has announced that it is cutting its investment in Flash big time to concentrate on HTML 5.0.

    Early indications are that Windows 8 will offer an alternative programming model to .Net as we know it; here is a quick overview from The Register. Whether Win 8 is a success or not remains to be seen, but clearly Microsoft's commitment to .Net in its current format is under review... it is interesting, though, that COM has re-emerged!

    The iPhone mentality is starting to prevail - what I mean by this is that as people get more and more used to working with mobile applications, they are starting to expect the same of the desktop: short attention spans, complex tasks achieved through simple steps, more intelligent applications, not so much multitasking, and so on...

    So what does this mean for us programmers?

    Well, with corporates being forced to deal with a device-agnostic environment, they will have to move most of their application logic to the servers and rely even more on browser-based front ends. HTML 5.0 is certainly well capable of doing this, so it should happen. Third-party ISVs who sell to the corporate market will follow the same trend. This being the case, developers working in corporate IT departments should start to develop HTML 5.0 skills, if they have not already. Skills in programming devices in native mode will probably not be as important... but an understanding of how these devices work will still be useful.
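    The "thin browser front end, logic on the server" idea above can be sketched roughly like this - a minimal TypeScript example where the client only formats what the server hands it. The endpoint name (/api/orders) and the Order shape are invented for illustration, not taken from any real system.

```typescript
// Hypothetical sketch: a thin browser front end that leaves all business
// logic on the server and only renders the JSON it is given. The /api/orders
// endpoint and the Order shape are made up for illustration.

interface Order {
  id: number;
  customer: string;
  total: number;
}

// The server decides what the user may see; the client just formats it.
function renderOrders(orders: Order[]): string {
  const rows = orders
    .map(o => `<tr><td>${o.id}</td><td>${o.customer}</td><td>${o.total.toFixed(2)}</td></tr>`)
    .join("");
  return `<table><thead><tr><th>ID</th><th>Customer</th><th>Total</th></tr></thead><tbody>${rows}</tbody></table>`;
}

// In a real page this would be wired to the server, e.g.:
//   const orders = await (await fetch("/api/orders")).json();
//   document.getElementById("orders")!.innerHTML = renderOrders(orders);
```

    The point is that swapping the device only swaps the rendering step; the logic behind /api/orders never moves off the server.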

    You will probably see the reverse among commercial software houses and games developers, where skills in native device programming will be to the fore, with HTML 5.0 of minor interest. As with corporates, though, complex applications will require a backend solution...

    Regardless of the technology, the demand will be for simpler UIs, but at the same time more intelligent applications that try to anticipate what the user wants to do and help them get there - simple examples would be things like filtering options in an input form based on location awareness, contact lists, previous orders, etc... Developing good HCI (Human-Computer Interaction) skills is something we need to do...
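    To make the "anticipate the user" idea concrete, here is a small hedged sketch: rank a contact list by how recently each contact was used and how close they are to the user's current position, so the likeliest picks float to the top of the form. The field names and the weighting formula are invented for illustration.

```typescript
// Hypothetical sketch of anticipatory filtering: recent use and physical
// proximity both push a contact up the list. The scoring weights are
// arbitrary, chosen only to demonstrate the idea.

interface Contact {
  name: string;
  lastUsed: number;    // Unix timestamp (ms) of the last interaction
  distanceKm: number;  // distance from the user's current location
}

function rankContacts(contacts: Contact[], now: number): Contact[] {
  const score = (c: Contact): number => {
    const daysSinceUse = (now - c.lastUsed) / 86_400_000;
    // Both terms decay toward 0 as the contact gets staler or farther away.
    return 1 / (1 + daysSinceUse) + 1 / (1 + c.distanceKm);
  };
  return [...contacts].sort((a, b) => score(b) - score(a));
}
```

    A real application would feed in location awareness and order history from the backend; the ranking itself is a few lines either way.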

    Regardless of the backend technology being used, we are going to have to develop far more sophisticated server-side applications - knowing how to read and write data to the database and spit out a few HTML pages will not be enough to get you by. The user experience is going to be very much the result of an intelligent back end, so if you are not already familiar with patterns such as the "State Machine", you need to get researching and start to figure out how to implement them in your chosen backend technology. If you are a ".Net guy" and are still using ASPX forms, then you need to start looking at MVC.
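    A minimal sketch of the state-machine pattern on the server might look like the following: the backend, not the page, decides which transitions are legal, so the UI only ever offers actions that make sense. The order workflow (draft, submitted, paid, ...) is a made-up example, not from any particular framework.

```typescript
// Hedged sketch of a server-side state machine for a hypothetical order
// workflow. The table of legal transitions lives on the server; the front
// end just asks which actions it is allowed to show.

type OrderState = "draft" | "submitted" | "paid" | "shipped" | "cancelled";

const transitions: Record<OrderState, OrderState[]> = {
  draft:     ["submitted", "cancelled"],
  submitted: ["paid", "cancelled"],
  paid:      ["shipped"],
  shipped:   [],   // terminal state
  cancelled: [],   // terminal state
};

// Reject illegal moves (e.g. shipping an unpaid order) centrally.
function canMove(from: OrderState, to: OrderState): boolean {
  return transitions[from].includes(to);
}

// The front end renders exactly these as buttons/links - nothing more.
function allowedActions(state: OrderState): OrderState[] {
  return transitions[state];
}
```

    Whether you implement this in .Net, Java or anything else, the win is the same: the rules live in one place on the server, and every device-specific front end inherits them for free.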

    All of this is not to say that the old skills are dead, they most certainly are not, but if you are thinking of "upskilling" in the new year and further out, these are points worth thinking about, at least from my point of view.

    I'm looking forward to the feedback...


Comments

  • Registered Users Posts: 40,038 ✭✭✭✭Sparks


    While HTML 5.0 and mobile apps are very new and very sexy, I earn my mortgage payments working with legacy code (and I mean legacy here - think 20 years old and 30 million lines of code, all C and C++). And I will bet the contents of my pockets against the contents of your pockets that there are more people whose job looks like mine than there are whose jobs are all to do with the latest sexy fad in the industry (and I'll lay good odds as well that we're paid better, have better working conditions and lower stress levels). I've nothing against the new stuff - I've done the startup thing and while I hate the culture, I think the tools are excellent (some are only really good enough for prototyping but they're still awesome tools nonetheless - you just don't use a chisel to turn a screw, if you follow me).

    I just think, every time I hear a keynote or read an article about what The Next Big Thing will be, that frankly only a tiny minority of the industry is concerned with The Next Big Thing, and the rest of the industry is too busy with wildly interesting problems or wildly profitable software to be worried about it. Just 'cos something is built from legacy code and written in a 30-year-old language (or indeed, is running on 30-year-old hardware), doesn't mean it's dull or boring. Hell, every space project ever done uses hardware that's almost never less than ten years old...


  • Registered Users Posts: 2,023 ✭✭✭Colonel Panic


    I'm with Sparks. The paradigm always shifts, but that doesn't invalidate legacy code. Not that legacy is a negative term anyway - functional languages, design patterns and modular design are nothing new, they just have snappy names now!

    I'm a big fan of doing something with the Next Big Thing but I still pay the bills writing C and C++ and I don't think it holds me back or hurts my career. I'm lucky in my current job that I get to do a mix of old and new.


  • Registered Users Posts: 7,157 ✭✭✭srsly78


    As I said in another thread, right now in work I am writing C++/CLI (essentially the .Net lookalike of C++) wrappers to let old C++ code interface with modern C# programs with fancy GUIs. Hilariously, we are also replacing web frontend stuff with normal desktop stuff (pretty much the opposite of what everyone else is doing), coz the web stuff just does not handle high-performance real-time work well. MVC = **** performance when you want to do serious stuff (trading systems have a lot in common with multiplayer computer games, from a development point of view).

    HTML5 is supposed to solve this, but many things are still not settled. Google supports experimental WebSockets and WebGL but Microsoft doesn't; it's all up in the air. So essentially it's not ready for industrial use as of today, but it's good to tinker with to get a head start.

    Regarding stuff like WebGL (not supported by Microsoft, they want everyone to use some new web-DirectX instead lol): it's pretty much exactly the same API calls as OpenGL ES. So even tho it's a "paradigm shift" coz it's embedded in a browser canvas, it's still EXACTLY THE FREAKING SAME as stuff I was writing in the 90s. Here is a cool WebGL demo, but it seems you can't try it out anymore coz Google sold it or something? http://googlesystem.blogspot.com/2011/09/body-browser-no-longer-google-service.html edit: Here's a similar WebGL demo (will work on Chrome, probably not with other browsers) http://www.biodigitalhuman.com/

    I haven't checked it out much, but I would hazard a guess that any WebSocket API will be exposed with functions exactly the same as old socket APIs (with Microsoft as usual trying to be edgy and different from the standard - ie dicks).
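    That guess can be illustrated with a small hedged sketch: a socket-style wrapper (send/recv/close) over a WebSocket-like transport. The Transport interface here is injected so the idea can be shown without a network; a real page would pass the browser's own WebSocket object, whose surface (send, onmessage, close) does echo the old BSD socket idiom, just callback-flavoured.

```typescript
// Hypothetical sketch: wrapping a WebSocket-shaped transport in the old
// blocking-socket vocabulary. Transport is an invented minimal interface
// standing in for the browser WebSocket object.

interface Transport {
  send(data: string): void;
  close(): void;
  onmessage: ((data: string) => void) | null;
}

class SocketLike {
  private received: string[] = [];

  constructor(private transport: Transport) {
    // Incoming messages are queued so recv() can drain them one at a time.
    transport.onmessage = (data) => this.received.push(data);
  }

  send(data: string): void { this.transport.send(data); }      // ~ send()
  recv(): string | undefined { return this.received.shift(); } // ~ recv()
  close(): void { this.transport.close(); }                    // ~ close()
}
```

    Same verbs as twenty years ago; only the plumbing underneath changed.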

    In summary: most experienced developers will just roll their eyes and say whatever.


  • Closed Accounts Posts: 18,056 ✭✭✭✭BostonB


    I end up working with a lot of old stuff and relatively little new stuff. I'm more of a jack-of-all-trades hacker than a true developer. But it has always seemed to me that a lot of new technologies, especially web technologies, are inappropriately used, in that they are very often used where there is no need for them to be web-centric. The result is systems and applications which are heavily compromised in performance, maintainability, development time, and functionality, often replacing older systems which were vastly superior. I'm solely talking about in-house applications on intranets.

