13 votes

Where do you see the future of IT going?

So, what's the hottest new thing in IT today, the coolest new tech that might prove to be a goldmine some years down the line? The way PCs, websites, databases, programming languages, etc. were in the 90s, or mobile computing was in the 00s? The early 00s gave us many goodies in terms of open source innovation, be it web technologies, Linux's advancement and propagation through the masses, FOSS software like WordPress and Drupal, or even the general attitude and awareness around FOSS. Bitcoin also deserves a notable mention here, whether you love it or hate it.

But today, I think IT no longer has that spark it once had. People keep mulling over AI, ML and data science, but these are decades-old concepts, and whatever number crunching or coding the engineers are doing somehow doesn't seem to reach the masses. People get so enthusiastic about ChatGPT, but at the end of the day it's just another piece of software like a zillion others. I'd put it on par with something like WordPress, probably even lower. I've yet to see any major adoption or industry usage for it.

Is it the case that IT has reached some kind of saturation point? Has everything that could have been innovated, at least the low-hanging fruit, already been innovated? What do you think about this?

17 comments

  1. PantsEnvy

    People get so enthusiastic about ChatGPT, but at the end of the day it's just another piece of software like a zillion others. I'd put it on par with something like WordPress, probably even lower. I've yet to see any major adoption or industry usage for it.

    It takes a long while for disruptive technology to actually disrupt things. The Innovator's Dilemma covers this very well. It talks about how railroad companies were disrupted by car manufacturing, how personal computers disrupted IBM's mainframes, how digital photography disrupted Kodak's film business. These companies were all slow to react, because the disruption initially seemed unimportant in a smaller market segment. More recent examples are how Apple disrupted Nokia with mobile phones and how Salesforce disrupted Siebel with SaaS software.

    There weren't a lot of early adopters of Apple phones. It was mostly people in the tech industry with a lot of disposable income. There weren't a lot of early adopters of Salesforce. It was mostly smaller sales departments that wanted something quick and easy. There weren't a lot of early adopters of the PC. It was mostly a few geeks.

    You could argue that the truly disruptive technology was the internet, and that the internet wasn't widely adopted until people bought PCs and smartphones in order to access SaaS software, social media, streaming, online stores, and so on.

    I think ChatGPT is a disruptive technology. It's not the thing you buy (a specific smartphone), it's the reason you might buy something (to get online). We don't know which products or companies it will disrupt. Google seems convinced it can completely disrupt its search engine, which is the core basis of its advertising income. Microsoft seems convinced it can disrupt its corporate software such as Office and Dynamics. They could simply be paranoid, because paranoia around disruptive technology is key to corporate longevity.

    Another potentially disruptive technology is AR/VR. If it ever gets unleashed in an elegant form factor, a lot of things will change in the technology space. Which kind of explains why Meta, Google & Apple are heavily invested in AR/VR.

    10 votes
  2. DawnPaladin

    I've been virtually attending the Microsoft Build developer conference this week. Microsoft definitely does not see large language models as a ho-hum, business-as-usual kind of deal; they're talking about this as an "invention of the mouse" kind of moment. They announced Windows Copilot, which lets you give natural-language instructions to your computer, like the ship's computer on Star Trek.[1]

    Microsoft wants to provide private LLMs that can understand and interact with all of your business data, which they say will revolutionize how work is done. Also, LLMs can read and write code; with the popularity of infrastructure-as-code, I expect IT to get shaken up significantly.

    Over the next few days I'm going to go over my notes and prepare a presentation for my coworkers; I'll post some of it in Skybrian's AI megathread. Microsoft said something like "Every app is going to be rewritten for AI." Even if half of that is hype, IT is in for an interesting time. Microsoft is going all-in on LLMs; that cannot help but affect the rest of the industry.

    [1] Their demos mostly show typed instructions rather than spoken ones, but it shouldn't be difficult to set up speech-to-text.

    7 votes
  3. 0d_billie

    A lot of the other comments in here have already made the points that I would, except for one. As I see it, ChatGPT and the various other intelligences are not "mature" products. AI is not at the stage of a technology's lifecycle where it has become commoditised and ubiquitous.

    Look at smartphones: in the first decade or so of smartphones existing, every new release felt like a giant leap forward. We added a camera! Now there's a camera pointing at your face! 4G! Video calling! Music streaming! Podcasts! Mobile gaming! Biometric security! etc. Compare that to today's increasingly iterative releases, where design and software are about the only things that differentiate products. We as a society, and the market in general, have figured out what the optimal use case for the tech is, and so we just see minor improvements year on year now: 10% more efficient, 12% larger battery, double the megapixels in your camera, etc.

    Looking at AI in its current state, however, I think it should be likened to the Motorola DynaTAC 8000X, the first commercially available mobile phone, with its talk time of 30 minutes and 10-hour charge time. In my view, we are right at the bottom of the sigmoid curve when it comes to AI, and the next few years are going to be incredibly explosive in terms of what researchers figure out AI can do.

    It's also important to remember that ChatGPT ≠ AI. There is a lot of other incredible stuff going on in the world of machine learning and artificial intelligence, and while ChatGPT gets the limelight, the brilliance of (for example) photo editing AIs should not be understated.

    7 votes
  4. [13]
    vord

    I have many thoughts of various positive and negative connotations.

    For one, computers are complicated, fragile, fickle things. They can and do break, and even the best identical-configuration tooling will still result in odd, irreproducible behaviors. The need for IT will be around as long as computers are incapable of repairing, replacing, and upgrading themselves.

    IT trends thus far have been quite cyclical. Cloud vs. on-prem is far older than many think... remoting into other services has been around as long as networking itself.

    The push to the latest iteration, vendor-specific cloud services, could be the death knell for it, given how easy they make it to get in and how hard to get out. This worries me, as it gives Google, Amazon, and Microsoft even more power to control the direction of the industry. Especially Microsoft, which was able to make Teams a thing not because of its superior quality but by bundling it with their other stuff and making it "free." The hooks they have into the education system make vertical integration from them even more problematic than Apple's. There's a reason WordPerfect died even though it was just as good as, if not better than, Word.

    I do think AI tooling is improving, and for art in particular it's problematic. Programming/IT can transition with this shift more easily, but art was already undervalued, and being able to hurt the economics of artists (musicians and writers included) is going to be a problem. I think AI will lose its appeal when its costs rise along with electricity costs (and people get locked into their cloud provider's AI).

    Companies still fail to use ERP systems correctly, and they've been around my whole life. I have my doubts that any newer tech will fix this fundamental problem. Amazing tools are no good if nobody uses them.

    Technological competence across the population of computer users is abysmal. Education and industry both failed. The adoption of computers into the workplace should have turned everyone into programmers. If you can comprehend "If this, then that," you can learn to program.
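
    (To make that concrete: a rough sketch, nothing rigorous, of an everyday "if this, then that" rule written out in C++. The rain example is just an illustration:)

        #include <iostream>

        int main() {
            bool raining = true;  // "this": something you can observe

            if (raining) {                           // if this...
                std::cout << "Take an umbrella.\n";  // ...then that
            } else {
                std::cout << "Enjoy the walk.\n";
            }
            return 0;
        }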

    I foresee a great drop-off in knowledge when the 30+ crowd retires. I think the industry will recover, but there will be a period of pain that wasn't felt before.

    I firmly believe open source (specifically copyleft) software is the only proper tool for user empowerment. In a world run by software, user empowerment is the only way to ensure that the stuff you buy is actually yours.

    I hope that we see a future where everyone takes on both a blue-collar trade, like plumbing, and a knowledge job, like accounting or IT. Neither job would be full-time year-round; each would wax and wane based on need.

    4 votes
    1. [5]
      NaraVara

      The need for IT will be around as long as computers are incapable of repairing, replacing, and upgrading themselves.

      If anything, I think it will be more pronounced and require greater expertise and more specific training on the part of IT people.

      I dramatically overestimated how normal it would be to have functioning computer literacy skills in the future. I based my assumptions on my experience as an elder millennial growing up with technology. The kids we're hiring now, most of whom have CS degrees, are woefully uninformed about how computers actually work. Even the Data Scientists don't actually understand the statistics they're citing. They run these Python libraries and they have very rudimentary heuristics about how to interpret the numbers that come out, but they have no grasp on the underlying math.

      I think this erosion in fundamental knowledge will only get worse as AI further abstracts away the need to understand how things work underneath the hood. That'll be fine as long as you're on the happy path where all the tooling works fine, but when it fails, people will be utterly helpless to fix it without help. IT people will have to become general "handyman" types with information, knowing not just how to manage components and connections, but a lot of fairly sophisticated software and hardware interactions as well.

      6 votes
      1. [2]
        Akir

        I think this erosion in fundamental knowledge will only get worse as AI further abstracts away the need to understand how things work underneath the hood.

        Of all the AI doomsday tales I've heard, this is the one I'm most afraid of. It's similar to the reason I think trade secrets are a fundamentally bad thing; I don't want to live in a world where we could lose important information about how the things our society depends on work. Imagine Coca-Cola goes belly-up and we lose the recipe for Coke. Sure, that wouldn't be so bad, but imagine if we lose the ability to make OLED displays or high-performance CPUs.

        The good news is that it seems like there are enough people who actually do understand some of the most important fundamentals, and with so many secret designs being reverse-engineered these days I don't know if we'll ever get to the point that we will lose these things. Video game consoles, for instance, all have their share of proprietary "black box" components, yet today there are emulators available for all but the most recent ones. We even have implementations so accurate that we're simulating their imperfections as well.

        9 votes
        1. vord

          imagine if we lose the ability to make OLED displays or high-performance CPUs.

          There's a chance this happens anyway when the people who designed x86 (and maybe x86_64) die. There is a lot of voodoo going on under the hood, and some of it is almost certainly lost already. At least in terms of "why and how," if not the implementation.

          7 votes
      2. [2]
        skybrian

        I’m wondering how much of this is forgetting what it was like to be a new grad. Thinking back to when I got out of college, there were all sorts of things I knew nothing about. (Though, a lot of stuff we take for granted now hadn’t been invented yet.)

        It seems like it should be easier than ever for a motivated beginner to learn a lot pretty quickly? How do we know this isn’t happening?

        2 votes
        1. NaraVara

          I don’t know. My impression was always that new grads should have lots of solid, very current book knowledge that elders temper with real-world experience. But the people I’m encountering are often missing fundamental book knowledge.

          3 votes
    2. [7]
      FlippantGod

      The adoption of computers into the workplace should have turned everyone into programmers. If you can comprehend "If this, then that," you can learn to program.

      I used to think that, but "can" and "will" are different. Tooling is painful and there is too much prerequisite domain knowledge. The barrier to entry is still high.

      It (sort of) happened anyway. Python has been widely adopted by the scientific community. JavaScript, somewhat differently, was learned by so many web developers they now write their own tools and infrastructure with it.

      Considering the trajectory of AI tools, I expect everyone will be able to generate programs. I also expect only a fraction of people actually will.

      2 votes
      1. [6]
        vord
        (edited )

        Tooling is painful and there is too much prerequisite domain knowledge. The barrier to entry is still high.

        This is why the education system failed. It's not harder to learn to use Make and a C compiler than it is to learn Excel and Word; it's just different. Over the course of my K-12 education 30+ years ago, I attended 4 mandatory Microsoft Office classes, but 0 mandatory programming classes. (I had to go to college in 11th grade to get beyond the elective Intro to Programming.)

        The more complex tooling is courtesy of the IT/CS industry catering exclusively to college grads. If the average programmer had only completed high school, I'd bet the average programming language would be easier to use.

        2 votes
        1. [5]
          Adys

          It's not harder to learn to use Make and a C compiler than it is to learn Excel and Word; it's just different.

          Hard disagree.

          4 votes
          1. [4]
            vord
            (edited )
            • Exemplary

            Much like anything, it's as complex as you want it to be.

            Sure, a makefile can be a giant, complex monstrosity. It can also just be a really easy way to run a series of shell commands.

            The great lie is that command-line tools are harder than GUI tools. They are just harder and easier in different ways. And since most people are trained on a GUI as end users, the CLI is harder for them to grok, because they try to map GUI skills onto CLI skills, and learning any new skill is harder than using one you already have.

            Here, a tutorial:

            # "make hi" just echoes two lines; "make compile" builds hello.cpp into main.exe
            # (note: in a real Makefile, the recipe lines must be indented with a tab)
            hi:
                echo "hello"
                echo "world"
            compile:
                g++ -o main.exe hello.cpp
            

            And off you go through an intro to C++ book. Run make compile when you need to. You don't need anything more than that, any more than you need to go beyond "=SUM()" in Excel at first. Those other things come as you progress.
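
            (For completeness: the compile target assumes a hello.cpp sitting next to the Makefile. A minimal placeholder version could be something like:)

                // hello.cpp -- the file the "compile" target builds into main.exe
                #include <iostream>

                int main() {
                    std::cout << "hello world" << std::endl;
                    return 0;
                }

            Run make compile, then ./main.exe, and you've been through one full edit-build-run loop.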

            If you sit down a person who can read and write fluently but has never seen a computer, I'd bet a nickel you could teach them the basic command-line tools faster than how to use a mouse. I've seen it with my own eyes.

            Edit: to adopt a more accepted idiom: learning math and learning to read are not intrinsically easier or harder. Just different.

            Edit 2: And Excel itself is easier to learn if you can program, because you know that SUM() is a function and how parameters work.
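
            (Rough illustration of that last point, nothing more: once you've written a function like the one below, =SUM(A1, B1) in a spreadsheet stops being magic; it's just a function call with two parameters.)

                #include <iostream>

                // sum() takes two parameters and returns their total --
                // the same mental model as =SUM(A1, B1) in a spreadsheet cell
                int sum(int a, int b) {
                    return a + b;
                }

                int main() {
                    std::cout << sum(2, 3) << "\n";  // prints 5
                    return 0;
                }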

            2 votes
            1. [2]
              noble_pleb

              The only issue with C/C++ is that you need to write tons of code to do a small arithmetic or logical operation, and when things like pointers get involved, the average human is bound to feel terrified! Maybe Python is the neater example here: not only does it work at a higher level, it also has an ecosystem of libraries for almost anything you can dream of. And indeed, power users are using Python more and more for their daily IT work; it's no longer just a programmer's language these days.

              1 vote
              1. vord
                (edited )

                To reply to you and @skybrian...

                Yes. However, part of the reason I chose C++ is that it was what was taught in our Introduction to Programming class in 1997, in early high school, in the middle of a fairly poor red-state school district where the majority went on to be blue collar. And there were kids who were C students, whose highest grades were in metal shop, who couldn't tell the monitor from the PC when asked. And they ended the class with a B, solving problems in the same vein as Advent of Code.

                Imagine a world where solving Advent of Code was a high-school skill. We could have had that world. Instead we shot for mediocrity, and the educators and politicians of the USA incorrectly assumed that using a computer would continue to necessitate knowing how to fix it, that the technological prowess of late Gen X and early millennials was a natural progression and not a fluke of widespread, but difficult to use, computers.

                1 vote
            2. skybrian

              It’s possible to get started that way, but why do that when there are so many better ways nowadays? Segfaults are pretty user-unfriendly, and there’s no standard package manager for C.

              For a beginner, I might suggest an online notebook-style programming environment. I’m most familiar with Observable, which is based on JavaScript and quite nice for making graphs, but maybe a Python-based environment would work better for some things?

              Even if they do want to stick with the command line, Go or Node would give you a huge number of packages to choose from without C’s footguns. Or I might give Deno a try, since apparently it will build standalone executables easily. (I am wary of Python’s fragmented package ecosystem.)

              1 vote
  5. skybrian

    I don’t know, but here are some things to think about:

    I expect that due to AI, creating demos that work (at least sometimes) will get increasingly easier. For a lot of tasks, that may be enough. You can do one-off tasks pretty easily. Some people do everything in spreadsheets and I expect there will be new tools like that, which might still look kind of like spreadsheets?

    Making software that can be trusted to work even in the face of increasingly sophisticated, hostile attacks is a whole different thing. Hard to say how that plays out. I expect having a trusted brand like Apple will still matter a lot.

    So, maybe we end up with more unreliable one-off code running in increasingly hardened sandboxes? Like web pages and Docker containers, but more of that.

    Hard security slowly improves in response to things like ransomware. Phishing should get harder with new kinds of authentication where you can’t give away your password. But other kinds of fraud remain very easy due to widespread sharing via social media.

    2 votes