
Why has web development become so popular?

  • 18-05-2012 12:21pm
    #1
    Registered Users Posts: 507 ✭✭✭


    Hi folks,

    I've been looking around at jobs lately and 90% of the .NET jobs seem to be related to ASP.NET / C# / MVC / JavaScript / jQuery / AJAX.

    I'm just wondering why this seems to be all anyone is using nowadays.

    I've lots of experience in WinForms and I'm starting to worry my CV is looking very stale...

    Are there really that many new web development companies, or are people starting to develop internal software applications using these technologies as well??

    Thanks...


Comments

  • Registered Users Posts: 586 ✭✭✭Aswerty


    There are a lot of advantages to web applications, including no installation, no upgrading, access from any internet-enabled computer, and a friendlier (sometimes cheaper) cost model (SaaS). And that's just for the user; vendors benefit through no longer needing distribution channels, a continuous revenue stream, and the ability to apply patches and new features at any time.

    The main downside to web applications used to be poor responsiveness, but current technologies, mostly JavaScript based, have overcome this issue. So these days, unless you have demands that only a desktop application can meet, I can't see much of a reason not to deploy over the internet. A number of heavy-duty applications are still desktop based because they require rapid response between the presentation and business logic layers, or have heavy graphical requirements, and/or the market has not put any pressure on them to move to the web. Lack of pressure can be due to regulatory requirements (e.g. in the pharma sector) or sensitive data (i.e. any business wanting to keep its data completely internal) making web deployments nonviable. Vendors with web applications have countered this by making their product purchasable for internal deployment over a company's private network. This is not a perfect solution, because it reintroduces the idea of product versions and the hassle of patching and adding new features, but it does remove installation hassle, which is very important for companies that do not give their employees administrative rights.

    As the cloud becomes more and more of a platform for businesses, we will see fewer and fewer desktop applications. We're not going to see the end of desktop applications in the immediate future, but I think a lack of competency in web development is going to seriously curtail job choice more and more if things keep going the way they are.

    This is pretty much just in regard to line-of-business and social/entertainment software. Embedded development, drivers, etc. are not going to be moving onto the cloud anytime soon.


  • Registered Users Posts: 113 ✭✭lenovoguy


    Aswerty wrote: »
    There are a lot of advantages to web applications, including no installation, no upgrading, access from any internet-enabled computer, and a friendlier (sometimes cheaper) cost model (SaaS). And that's just for the user; vendors benefit through no longer needing distribution channels, a continuous revenue stream, and the ability to apply patches and new features at any time.

    I'd be in complete agreement here, TBH. The browser is fast replacing big honking installed applications for lots of use cases. I work for a dev shop where the product is a Win32 application with a raft of plugins and components implemented in .NET, and from a development, user and support perspective the web beats the desktop hands down.

    Installed apps, especially on Windows, can be a nightmare: creating installers, requiring the right user permissions, registering program components in the Windows Registry, making sure uninstallation doesn't leave behind anything it shouldn't, catering for multiple operating system versions and their various idiosyncrasies, countering piracy and reverse-engineering, dealing with file formats and versioning issues; it's a complete saga. These issues cause all manner of support tickets, due to the myriad different operating system, hardware and security configurations users and companies employ, and are a real distraction from the core business.

    Compare this with a typical web app: you point your browser at a URL, log in if necessary, and that's it. The user spends less time worrying about administrative nonsense and more time using the product. The vendor doesn't have to worry about piracy, as the app is either free or the user needs a login supplied on purchase. There is less chance of competitors ripping off your code, as it's all stored on the server apart from the JavaScript, and hopefully there are no trade secrets going down in client-side code.

    Additionally, there's no cranky licensing system to deal with; you have a login, and possibly a PayPal account associated for any paid services, and that's all. There's no activation/de-activation required for new installations or license transfers. For our software, for some reason we have to activate licenses manually, so any customers wishing to activate outside of Irish office hours have to just... wait.

    The two points above enormously simplify support with web apps and free up support staff to deal with _real_ problems like defects in the actual functionality rather than the environmental stuff.

    Development-wise, most web apps are implemented in dynamically typed, garbage-collected scripting languages like JavaScript, PHP and Ruby, so there's no waiting around for compilation: you see any and all changes you make immediately, which has to be a massive productivity gain. I know the likes of Facebook have hacks to cross-compile PHP into C++ to get speed improvements, but I don't think that's a common solution.

    On top of that, with web apps you're typically building on top of open source frameworks and libraries like Rails, Node.js or jQuery that are regularly updated and have thriving communities of users.
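    To make that concrete, here's a minimal sketch of the kind of jQuery/AJAX call behind most of the job specs the OP is seeing: fetch JSON from the server and patch part of the page without a reload. The /api/orders endpoint and the #orders element are made up for illustration, and it assumes jQuery is loaded on the page:

        // Fetch JSON in the background and update one element in place.
        // Endpoint and element ids are hypothetical.
        $.getJSON('/api/orders', function (orders) {
            var items = orders.map(function (o) {
                return '<li>' + o.id + ': ' + o.status + '</li>';
            });
            $('#orders').html(items.join(''));   // replace the list in place
        }).fail(function () {
            $('#orders').html('<li>Could not load orders</li>');
        });

    Edit the script, hit refresh, and the change is live: no build step, no installer, no rollout.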

    Where I work, we have to wait about half an hour for compilation, and that's with the assistance of a distributed build tool which delegates the compilation task to other available LAN nodes. Without it, compilation takes about 5 hours, and you really need to keep your wits about you when editing certain header files or they can kick off a full rebuild. On top of that, we're tied into aging, proprietary frameworks like .NET/MFC and a few others, which puts us in a very precarious position when it comes to planning ahead and making technology decisions down the road.

    I've come to the same conclusion as the OP, that the writing is on the wall for the desktop app broadly speaking, so I've been developing mobile and web apps in my spare time. Nothing fancy at all, but just something to put on my CV and keep abreast of technological developments. In general, it seems that if you're developing on the desktop today, you're doing it either for very domain-specific reasons, because there is no business case to do it any other way, or because the decision-makers are just stuck in their ways. I've fired CVs out to a few mobile/web shops this week, as the desktop seems to be a sinking ship from an employment perspective.


  • Registered Users Posts: 7,157 ✭✭✭srsly78


    Bit of balance please, lads... You haven't pointed out the situations where web apps completely fail, usually anything requiring high performance. Ever tried to do full-duplex, low-latency publish/subscribe over WCF? :D I've seen some big projects crash and burn because the shiny new web app ended up being 100x slower than the old C++ WinForms app it replaced.

    Sure for 99% of stuff it's fine and even better than the "old way".


  • Registered Users Posts: 2,022 ✭✭✭Colonel Panic


    In agreement with srsly78. The right tool for the job, not the latest shiny thing. Managing dependencies, varying network environments and installations are manageable downsides that I think *can* be worth it.

    From Lenovoguy's post history, his outlook is tarnished by the fact that he works for a total cowboy outfit and won't look for a better job.

    Most large apps tend to be a composite of desktop, web and server-side stuff. It's all about knowing a little bit about everything and trying to be an expert in one or two things.


  • Registered Users Posts: 1,922 ✭✭✭fergalr


    Obviously, right tool for the job, etc.; maybe the web won't be the platform for everything; certainly it's not suitable for high-end games - yet.

    But, mostly, this is a discussion for 2004.

    That's when changes in web standards and the advent of technologies like AJAX made it possible to write highly interactive apps on the web. For most people, their first experience of this was Gmail.

    http://en.wikipedia.org/wiki/Web_2.0
    I'm not being facetious by providing that link - that article is an OK jumping-off point for the entire discussion about the web becoming the dominant platform, a discussion that really did start back in 2004, including many of the points about being able to deploy the application to the browser each time the user visits the website, and hence not having to manage many legacy clients, etc.
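    For context, the whole AJAX trick boils down to something like this raw XMLHttpRequest sketch: ask the server for data in the background, then patch one element instead of reloading the whole page. The endpoint and element id are hypothetical (and circa 2004 you'd also have needed an ActiveXObject fallback for old IE, omitted here):

        // The circa-2004 pattern behind Gmail-style interactivity.
        var xhr = new XMLHttpRequest();
        xhr.open('GET', '/inbox/unread-count');   // hypothetical endpoint
        xhr.onreadystatechange = function () {
            if (xhr.readyState === 4 && xhr.status === 200) {
                document.getElementById('unread').textContent = xhr.responseText;
            }
        };
        xhr.send();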


    Since then, one of the major changes in the field has been the rise of rich apps on the smartphone. People are now talking about a future where you need to support the tablet as a major, or possibly dominant, form of interface. Whether this will happen through HTML5 etc. or through rich native apps is all playing out at the moment.


    Obviously, there's a time lag between when people perceive that one of these seismic platform shifts is happening and when it starts to affect narrow, small-scale verticals, e.g. specialised in-house business software. But the discussion isn't new, and the shift from desktop to web seems to be *mostly* a done deal at this point; people are probably worrying more about the potential shift from web to mobile/tablet.


  • Registered Users Posts: 2,022 ✭✭✭Colonel Panic


    I think high performance web-based games are close. If not via WebGL then via the likes of NaCl on Chrome, something I've been putting a lot of time into for the past couple of months.

    What do you think of my comment about modern apps being a composite of web and desktop, or native tablet applications?


  • Registered Users Posts: 1,922 ✭✭✭fergalr


    Colonel Panic wrote: »
    I think high performance web-based games are close. If not via WebGL then via the likes of NaCl on Chrome, something I've been putting a lot of time into for the past couple of months.

    I think WebGL is really interesting.
    I've done a bit of OpenGL programming before, so I know the programming model and, selfishly, I hope it gains adoption!
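    For anyone who hasn't poked at it, bootstrapping WebGL is pleasantly simple. A minimal sketch, assuming a <canvas id="scene"> element on the page (the id is made up), that grabs a context and clears it:

        // Minimal WebGL bootstrap: get a context from a <canvas> and clear it.
        var canvas = document.getElementById('scene');      // hypothetical id
        var gl = canvas.getContext('webgl') ||
                 canvas.getContext('experimental-webgl');   // older browsers
        if (!gl) {
            alert('WebGL not supported in this browser');
        } else {
            gl.viewport(0, 0, canvas.width, canvas.height);
            gl.clearColor(0.1, 0.1, 0.1, 1.0);              // dark grey
            gl.clear(gl.COLOR_BUFFER_BIT);                  // OpenGL-style call
        }

    Everything past that (shaders, vertex buffers) follows the familiar OpenGL ES 2.0 model.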

    I think people could create amazing domain-specific tools in WebGL - a "D3.js for WebGL" would be cool; see the sketch below for what I mean by the D3 model.
    There are a lot of applications where WebGL wouldn't ever be relevant, mind.
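    D3's core idea is binding an array of data to DOM elements and deriving visual attributes from the values declaratively. A minimal sketch, with made-up data and a hypothetical #chart element, assuming d3 is loaded on the page:

        // Join data to elements; each value becomes a bar with proportional width.
        var scores = [4, 8, 15, 16, 23, 42];            // made-up data
        d3.select('#chart').selectAll('div')
            .data(scores)
          .enter().append('div')
            .style('width', function (d) { return d * 10 + 'px'; })
            .text(function (d) { return d; });

    Something with that declarative flavour, but rendering through WebGL instead of the DOM, is the kind of tool I'd love to see.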

    I'm really interested in different ways of working with data, visualising data etc., so I'd like to see it gain traction. In one way, we've had little innovation in our basic ways of interacting with data in the last 10 years; there's a sense in which not a whole lot has changed since the first GUI spreadsheets: same buttons, dialogs etc. This is supposed to be the information age, so why is it still so difficult to manipulate information? Where's my jetpack? :-)
    Mobile and touch have moved things forward a little, I think.


    So, WebGL would be cool if you could take it for granted and it ran fast enough everywhere. It seems so hard to standardise anything on the web, though, that I wouldn't hold my breath.

    The other issue is always security.
    Allowing untrusted sites to run code on your GPU is pretty interesting.
    I don't know the current state of play with GPUs, but in general, I don't think GPU layers are designed with security in mind, so there's probably a whole set of issues to fix there, too.

    Colonel Panic wrote: »
    What do you think of my comment about modern apps being a composite of web and desktop, or native tablet applications?

    I don't know. At one level, that seems like a fair enough comment to me. Sufficiently large apps these days probably do need to support many types of interface and client.

    At another level, I guess the answer is always 'it depends'.

    There are modern apps that run on just one of: native tablet, mobile, web, desktop; and all subsets of those.
    There is a lot of code still being written for embedded systems, etc...

    I find it hard to generalise; maybe I'm just not in a position of knowledge on this.


  • Moderators, Technology & Internet Moderators Posts: 1,334 Mod ✭✭✭✭croo




  • Registered Users Posts: 586 ✭✭✭Aswerty


    Colonel Panic wrote: »
    In agreement with srsly78. The right tool for the job, not the latest shiny thing. Managing dependencies, varying network environments and installations are manageable downsides that I think *can* be worth it.

    I'd hardly call the internet the latest shiny thing. I see this argument a lot, and much of the time it is valid, but web applications are a progressive step in computing, not a new fad. I'd of course go with the best tool for the job, but as I see it, this means going for a web application wherever it is technically viable. As I mentioned in my earlier post, there are still a number of reasons why a desktop application may be required, and I don't think anyone has disagreed with those. The web just makes deployment so much easier, as long as the latency and the resource cost of sending data between client and server are acceptable.

    I'm not involved at all with app development, but there seem to be a number of reasons why installing apps currently trumps server-side deployment on mobile. These include:

    Mobile broadband is significantly more expensive than traditional broadband, and installing apps reduces bandwidth consumption. This issue will resolve itself once the mobile infrastructure becomes more robust.

    There are a lot of mobile OS enhancement apps that need to be locally installed. Most of these will end up being cannibalised as the likes of Android and iOS mature.

    The unification of the traditional web with the mobile web can be side-stepped by creating an app. I don't think this will persist, since many internet gurus are promoting a unified web as opposed to splitting it into the web and the mobile web.

    The pricing model for apps makes more sense than the web's subscription model when the cost is so low. I don't see this going away, but low cost means low development hours, which typically means a terrible product; that seems to be the case for 99% of apps. The particularly useful ones tend to end up with a huge user base, which should make significant investment in them financially viable.
    croo wrote:

    Interesting article.


  • Registered Users Posts: 2,022 ✭✭✭Colonel Panic


    Aswerty wrote: »
    I'd hardly call the internet the latest shiny thing. I see this argument a lot, and much of the time it is valid, but web applications are a progressive step in computing, not a new fad. I'd of course go with the best tool for the job, but as I see it, this means going for a web application wherever it is technically viable. As I mentioned in my earlier post, there are still a number of reasons why a desktop application may be required, and I don't think anyone has disagreed with those. The web just makes deployment so much easier, as long as the latency and the resource cost of sending data between client and server are acceptable.

    I'd hardly call it that either. I'm talking more about Software as a Service and cloud-based applications as a complete replacement for the desktop, and about technologies and frameworks that forsake performance to be easier to develop with, which are not always the best solution.

    I have developed applications "in the cloud" that just were not suitable for the medium so I've been on the receiving end of it. I've also developed web applications that are very suited to the cloud, but end users only see a desktop or mobile app and their data synced everywhere.


  • Registered Users Posts: 586 ✭✭✭Aswerty


    Well I think we are on the same page then.


  • Registered Users Posts: 12,342 ✭✭✭✭starlit


    My honest opinion: the most popular web programming languages are easier to learn, and the code is not as messy compared to the likes of Java, Python and VBA, ugh.

    I like HTML, CSS and PHP, and enjoyed coding in them while I was in college.

    I have noticed that too while looking for IT jobs in particular web programming/design related jobs.

    I was good at cloud computing in college but still cannot get to grips with what it is all about, really, in terms of jobs. All I can see is storage-related stuff....


  • Registered Users Posts: 7,157 ✭✭✭srsly78


    Laugh at VBA all you want; VBA guys get paid the most (finance analyst jobs) :pac:

