• Showing only topics with the tag "intelligence".
    1. Apple Intelligence doesn't work the way I want it to

      Recently I did an update on my MacBook and it started showing alerts about Apple Intelligence. I've heard a little bit of marketing about this, but I haven't really spent any time trying to figure out whether it is just hype. Well, I've tried it a few times and I'm completely underwhelmed.
      One of the marketed features is that Siri is much improved. That would be nice, I thought, because there are only a few use cases, like "set an alarm", where Siri could ever do anything besides a Google search.

      So there are two times recently I've tried to use this improved Siri to solve a problem. My background using AI: I use Copilot at work. I get mixed results from it, but it does use my local context (open files, etc.) and is able to ask follow-up questions if my prompt is too vague.

      First Use Case: I want to solve a technical problem on my laptop

      • My Prompt: "Can you help me fix Discord so that audio is shared when I share a video stream"
      • My Expectation: Maybe an AI summary of the cause of the issue. Maybe open up System Settings or Discord, or give an explanation of why this is a technical problem on Macs.
      • Actual Siri Response: Does an internet search and shows some links. Essentially it just did a Google search, which I could have done by typing the same prompt into a browser.

      Second Use Case: I want help finding a file on my laptop

      In this case, I made a summary of my finances on my laptop a few months ago. I can't remember what I named the file or what kind of file it was. Maybe a spreadsheet? I know it was on my local computer.

      • My 1st Prompt: "Can you help me find a specific file on my computer"
      • My Expectation: Maybe some follow-up questions where it asks me for a date range or something that is inside the file. Yes, I know that I can do this in Finder, but I want Apple Intelligence to save me a few minutes.
      • Siri: Shows the results of a web search on how to find files on a computer. The first few results are for Microsoft Windows.
      • My 2nd Prompt: "Can you help me find a specific file on my Mac"
      • Siri: Tells me to press Command-Space and use the search.

      In both cases, Siri just acted as a shortcut to a Google search. It didn't even recognize that I was asking the question on a Mac. This is the same as Siri has always been. I assume it can still figure out how to set a timer and do a few other things, but it doesn't seem to be working the way I would expect an AI to work at all.

      28 votes
    2. Would you want to work for a company that uses a coding test to select workers, even for non-coding positions?

      I'm in the midst of an interview process with an employer that insists on an "Introduction to Algorithms"-type test for all of its white-collar workers. Their claim is that it selects for "smart" people. [I'm anxious because my relevant coursework was many years ago, and there's no way I'll have time to master it again before the scheduled test; there's also some age bias, noted below.]

      Based on a review of Glassdoor comments about this company's interview process and demographics, what they really want is recent college graduates with fresh CIS degrees whom they can abuse and use up quickly, giving them no market-relevant skills in the process. The product relies on an obscure, specialized database architecture and aging front-end code.

      However, the company is a market leader in my industry, and I'm interested in working there in a customer-facing technical liaison/project management role because the product is better fitted to the task and has better support, customization, and interoperability than anything else. There's huge R&D reinvestment as well, and the company is just that little bit more ethical in the marketplace than its competitors.

      Do you believe that the ability to do sorts and permutations in code genuinely selects for general intelligence, and would you want to work with a population of people who all mastered this subject matter, regardless of their actual job title?
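
      For context, my guess at the kind of exercise such a test leans on (I haven't seen their actual questions, so this is an assumption) is something like generating every permutation of a list by hand:

```python
def permutations(items):
    """Recursively build every ordering of `items`: the classic
    textbook exercise that tests like this tend to ask for."""
    if len(items) <= 1:
        return [list(items)]
    result = []
    for i, first in enumerate(items):
        rest = items[:i] + items[i + 1:]  # everything except items[i]
        for tail in permutations(rest):
            result.append([first] + tail)
    return result

print(permutations([1, 2, 3]))  # 3! = 6 orderings
```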

      14 votes
    3. Does de-humanisation of others occur automatically, as soon as we believe that we can predict their actions?

      Dear Tildes community,

      this is an issue that's bugged me for some time. I might struggle to put this into the right words initially, because I have not studied philosophy, psychology, biology, sociology, or anthropology. Yet all of those fields could feed into this. I will edit this post to clarify things once people start commenting.

      I will begin by stating the question at the root of the issue I am trying to explore:

      Does de-humanisation of others occur automatically, as soon as we believe that we can predict their actions?

      Things to consider:

      • What is a measure of 'humanity'? Is it consciousness? Self-awareness? Intelligence? Empathy?
      • Is it true that a more 'conscious' or 'intelligent' creature is closer to us in nature and therefore should enjoy more rights, considerations, or respect? (Case in point: Some countries will not allow performing surgery on an octopus without anesthesia, due to them being considered very high up on the ladder of consciousness)
      • It is easy to conflate consciousness and intelligence. I think that's a bit of a trap. I have often looked at intelligence as a sort of "clock rate" of the brain. As in, you might be able to process information very quickly, but that's still pointless if you're running the wrong algorithms, or have very little knowledge to rely on. Intelligence all by itself is not a good measure of how 'conscious' or 'aware' or 'human' something is. Often, however, people tend to call animals more intelligent or less intelligent when they mean 'more highly developed', or 'more conscious'. The same probably applies to people as well.
      • Additionally, among self-aware, conscious beings (humans), empathy and intelligence can vary wildly. Therefore, does consciousness, or even 'human-ness', vary? Is a highly intelligent psychopath less human than a much less intelligent but empathetic person?
      • What do we use to assess whether a human is highly developed, or less developed / desirable? (Setting aside that we obviously shouldn't do so.) I think it is important to look at what mechanisms have been used in the past to demonise swathes of the population, in order to discredit them or further some kind of agenda. Take African people during the slave trade. They were called primitive, less intelligent, less human. In fact, in more subtle ways this even happens to women nowadays. They are constantly belittled by chauvinists for supposedly being less intellectually capable due to their gender. Are these all forms of de-humanisation, linked predominantly to intellect?
      • What is this founded upon? Is it predictability of their actions? Let's try to go full circle. How does one discredit a part of the population? One observes them and demonises their behaviours (and with that, culture, etc.) The predictability of such behaviours is essential in this. You cannot reliably say that "those brutes do [x], how disgusting", without there being frequent evidence of it actually happening. (On the flip-side, could people be predictably advanced or developed?)
      • What do we think of predictable people in general? Predictability has negative connotations. At best it's boring (say, a highly intelligent bureaucrat); at worst, stupid / less human (say, racists talking about another culture being predictably primitive).
      • Is there an implication of people, or beings, who are more predictable, having less free will? If your intellectual faculties are limited, or you operate on instinct more than you do on rational or logical deduction, you become more predictable, ergo, predictability == stupidity. (I know this is a fallacy, but I am trying to establish why one might irrationally and subconsciously dehumanise, not arguing in favour of this dehumanisation or trying to defend it)
      • Take our favourite pets, cats and dogs. They are pretty highly developed, and if it weren't for humans, they'd be unchallenged apex predators ruling the world. They display complex behaviours, at times even hard-to-predict ones. But still, they are animals and behave in reasonably reliable patterns. They are also not able to pass the mirror test for self-awareness, implying they are not self-aware (or only in extremely limited ways). So, one could argue they are less human, less intelligent. Now look at insects. Even less intelligent. Even though it could be argued that some (like ants) display a form of swarm intelligence, they are still extremely predictable. (Except for, perhaps, the flight patterns of flies or mosquitoes, which evolution has scrambled into extremely random patterns to avoid them being swatted. But that's just hard-coded into their genes, not an intelligent thought process.)
      • So, once more: think of someone you really don't like. Do you ever call them stupid for their actions? Would you ever say, "here we go again, they're doing this again"? Particularly if they are your boss? Perhaps it helps you cope with their shitty behaviour to dehumanise that person. Make them a lesser human being, to compensate for the fact that they make you feel powerless at work. If dehumanisation is such an immediate and convenient mechanism to protect yourself from feelings of inferiority, or to stop yourself from feeling threatened (say, by a different culture), perhaps it is in fact an ingrained behaviour, which expresses itself on a larger scale once fueled by propaganda and political intent. If we identify it and understand how it happens, we may protect ourselves against it by elevating others to a higher status of 'human-ness'.
      • When we 'have figured someone out', we are stating we can predict them. Are we putting them beneath us henceforth? Are they 'less' than us in some way? Being able to predict gives us power, so it makes us more powerful than them in some way, and so makes them lesser beings in some way.

      Why am I bringing all this up? In my life so far, I have gone from being very insecure, mistrusting, and scared of people, to being much more open, trusting, and confident.

      The more insecure I was, the more time I spent trying to prove to myself that I was somehow superior to others. Generally using intelligence as an argument (uggggh....). You know, like the goth teenager sitting in their basement, who is oh-so-individual and everyone else is so stupid and nobody understands my pain, etc. (see, dehumanising my past self right there, haha).

      The more I started trusting people and the more I started seeing everyone around me as humans, humans just like me, the more I began to see how others still apply these weird dehumanisation mechanisms to make themselves feel superior. This made me wonder whether there is some kind of innate drive to do so. Try to predict others, or paint them as predictable, to prove that you are superior to them, because they would not be able to predict your actions, as you are so far beyond their capabilities.

      So yeah, uhm....let me know what comes up in yer heads as you read through this, I'd be most interested to hear your perspectives.

      5 votes