  • Showing only topics in ~tech with the tag "tech industry".
    1. An opinion on current technological trends

      For a while now I have been completely dissatisfied with the direction (mainstream) technology is taking.

      Almost universally the theme is simplification of the user-facing side. That by itself would not be so bad, but products that go this route now universally involve a loss of user control, including things I would not have believed would be accepted just a decade or so ago: forced telemetry (i.e. spying on user habits), forced updates (i.e. changing functionality without the user's consent), loss of information through the simplification of error messages to the point of uselessness, customization options removed or buried where they are impossible to find unless you already know about them, and nagware, bloatware, and ads forcibly included in the base OS install. And that is just the desktop/laptop environment. The mobile one is truly insane, and anything else labeled "smart" is simply closed software and hardware with no regard for user agency at all.

      Personally, I consider the current iteration of the "just works" approach flawed: problems will inevitably arise, and withholding basic information and tools simply means that the end user does not know what happened and is dependent on support for trivialities. I also consider the various "hmmm" and "oops" style error messages degrading; they help cultivate a culture of technological helplessness.
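
      To illustrate the contrast, here is a toy sketch of my own in Python (hypothetical, not taken from any real product), comparing an error path that swallows every detail with one that surfaces what actually failed:

          # Hypothetical sketch contrasting two error-reporting styles.

          def save_opaque(path: str, data: str) -> None:
              """The "just works" style: every diagnostic detail is discarded."""
              try:
                  with open(path, "w") as f:
                      f.write(data)
              except OSError:
                  print("Oops! Something went wrong.")  # the user learns nothing

          def save_informative(path: str, data: str) -> None:
              """The same failure, reported so the user can act on it."""
              try:
                  with open(path, "w") as f:
                      f.write(data)
              except OSError as e:
                  # The path, the OS error text, and the errno are all preserved.
                  print(f"Could not write {path!r}: {e.strerror} (errno {e.errno})")

      With the first style, the user's only recourse is support; with the second, they can tell at a glance whether the problem is a permissions issue, a missing directory, or a full disk.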

      To be honest, I believe the choice most people end up making, of disinterest in even the superficial basics of technology, is an objectively bad one. Computing is one of the most complex and advanced technologies we have, but the user-facing side, even in systems such as Linux or Windows 7 and older, is simple to understand and use effectively with minimal effort. I do not believe most people are incapable of acquiring enough proficiency to, for example, install an OS, take a reasonable guess at what a sane error message means, or understand the basics of using a terminal; they simply choose not to bother. But we live, and will continue to live, in a technological world, and some universal technological literacy is needed to prevent loss of options and loss of agency for the end user. The changes introduced in mainstream software are on a very clear trajectory that will not reverse by itself.

      I have this vision of a future where the end user interacts solely with curated LLM systems without the least understanding of what is happening, why it is happening, or who makes it happen. The black-box nature of such systems would then introduce subtle biases, either papered over with brute-force patches or never caught at all, perpetuating who knows what. Unfortunately, given current trends, I do not think this is sufficiently unlikely.

      Up to a point I understand not wanting to deal with technology's problems, but the roadblocks introduced instead are just as annoying to get through, with the difference that they will not stay fixed. Technology directs a massive portion of our lives; choosing not to make the effort to understand even its surface is, I think, not a sound decision, and it creates a culture in which disempowering changes can be introduced en masse.

      So far this has honestly been a rant, and perhaps I just needed to vent, but I am genuinely interested in the community's thoughts on this broad topic.

      37 votes
    2. Are we stuck on an innovation plateau - and did startups burn through fifteen years of venture capital with nothing to show for it?

      The thesis I would like to discuss goes as follows (and I'm paraphrasing): during the last 15 years, low interest rates made billions of dollars easily available to startups. Unfortunately, this huge influx of venture capital has led to no perceivable innovation.

      Put cynically, the innovation startups have brought us across the last 15 years can be summarized as (paraphrasing again):

      • An illegal hotel chain destroying our cities
      • An illegal taxi company exploiting the poor
      • Fake money for criminals
      • A plagiarism machine/fancy auto-complete

      Everything else is either derivative or has failed.

      I personally think SpaceX has made phenomenal progress and would probably have failed somewhere along the way without cheap loans. There are also some biotech startups doing great things (like those behind the mRNA vaccines that won the race to market during COVID), but often that's just 20 years of research finally coming to fruition.

      Every other recent innovation I can think of came from a big player that would have invested in the tech regardless, and almost all of it is "just" an incremental improvement on ideas that are several decades old (I know, that's what progress looks like most of the time).

      What do you think? Do you have any counterexamples? Can you think of any big tech disruptions after quantitative easing made money almost free in 2008?

      And if you, like me, feel like we're stuck on a plateau - why do you think that is?

      83 votes
    3. Where do you see the future of IT going?

      So, what's the hottest new thing in IT today, the coolest new tech that might prove to be a goldmine some years down the line, the way PCs, websites, databases, programming languages, etc. were in the 90s, or mobile computing was in the 00s? The early 00s gave us many goodies in terms of open source innovation, be it web technologies, the advancement and propagation of Linux through the masses, FOSS software like WordPress and Drupal, or even the general attitude and awareness around FOSS. Bitcoin also deserves a notable mention here, whether you love it or hate it.

      But today, I think IT no longer has the spark it once had. People keep mulling over AI, ML, and data science, but these are still decades-old concepts, and whatever number crunching or coding the engineers are doing somehow doesn't seem to reach the masses. People get so enthusiastic about ChatGPT, but at the end of the day it's just another piece of software like a zillion others. I deem it on par with something like WordPress, probably even lesser. I have yet to see any major adoption or industry usage of it.

      Is it the case that IT has reached some kind of saturation point, where everything that could have been innovated, at least the low-hanging fruit, already has been? What do you think?

      13 votes
    4. Honest question: Are Windows or Linux laptops more suited for freelancers?

      I know it's a technical question, but I want to know specifically from a freelancer's perspective. A freelancer's decision-making in this regard differs from that of a regular corporate worker for many reasons:

      1. Freedom to choose: Unlike a corporate employee, a freelancer isn't bound by any imposed process or specific software guidelines. They're free to use Linux and open source if they want to.
      2. No team compatibility: A freelancer may work on a specific project with a geographically distant team, but they don't have to submit to any long-term compatibility constraints.
      3. Budget constraints: A freelancer typically can't afford costly licenses. Corporations can scale and bring licensing costs down, which isn't true for freelancers. Hence, open source software is typically more suited to their workflow (even when using a Windows OS).

      Given all these factors, do you think a Windows or Linux laptop is more suited to a typical freelancer? What do you happen to use?

      4 votes
    5. What will "classically trained" look like for computer science and digital literacy?

      This might be a weird framing, but it's been bugging me for a few days. Many fields have a concept of classical training -- this is most common in music but applies in the humanities and many other areas. For example, I do a lot of CAD work for my job, but I received what I would consider a "classical education" in design: I learned to draft by hand and physically model before I was ever allowed to work digitally. I got a lot of value out of this approach, and it still informs the way I work today.

      A lot of people view computers and technology as modern and almost anti-classical, but as the tech industry matures and the internet moves from something shiny and new to something foundational to our society, what will the new classicism look like?

      Thanks for reading my question.

      14 votes
    6. CES: We visit the tech industry's scary vision for the future

      The It Could Happen Here podcast did a 3-part series on this year's Consumer Electronics Show in Vegas, and I thought it was some of the most nuanced and interesting coverage I've seen.

      1: The dead future of Big Tech - host Robert Evans got his start in journalism doing tech reporting more than a decade ago, including covering CES. He reflects on how the show, and the tech industry as a whole, have changed over that time.

      2: The good parts of our future tech dystopia - Robert and co-host Garrison talk about the good and promising parts of what they saw at the show.

      3: We visit the tech industry's scary vision for the future - a discussion of the creepy and less-good stuff they saw at CES, including lots of surveillance cameras and robots.

      8 votes
    7. What are some of the best blogs, journals, e-magazines, etc. about programming or software development in general?

      I'm a solo freelance programmer who codes on small- to medium-sized projects, and I realize that I can upskill a lot by keeping up with industry trends and listening to what the best in this field have to say. The problem is that there is just so much information overload everywhere, so many YouTube videos and articles, that it seems overwhelming to separate the wheat from the chaff!

      Since reading is my preferred medium of instruction, I want to know which blogs, journals, etc. on this topic have some street cred - preferably those of individual experts, not companies. Corporate sites and blogs seem to be more hype than substance these days.

      Which ones do you refer to for keeping up to date?

      8 votes