tech-taters's recent activity

  1. Comment on What's a quantum computer? in ~tech

    tech-taters
    Link Parent

    Thanks for the article! That was an interesting read.

    You’re right that it’s not as simple as I made it out to be. I read a bit more, and it seems like the additional difficulty is primarily with Asymmetric Encryption. My understanding was “just plop AES-256 in,” but that is evidently not close to the whole picture. For Symmetric Encryption, it does sound reasonably easy to upgrade.

    5 votes
  2. Comment on What's a quantum computer? in ~tech

    tech-taters
    Link Parent

    Not a quantum expert, but I am a “conventional” computer engineer. Essentially, most encryption algorithms are “secure” because the amount of computation required to guess the right answer is functionally infinite. Like “all computers on earth, running for 1 million years” territory.
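    That “functionally infinite” scale survives even absurdly generous assumptions about the attacker. A back-of-the-envelope sketch in Python (the fleet size and guess rate are invented for illustration):

```python
# Brute-forcing a 128-bit key, granting the attacker ten billion
# machines that each try a billion keys per second (made-up numbers,
# deliberately generous).
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

keyspace = 2 ** 128
computers = 10 ** 10
guesses_per_second = 10 ** 9

years = keyspace / (computers * guesses_per_second * SECONDS_PER_YEAR)
print(f"{years:.2e} years")  # on the order of 1e12 years
```

    Even with that imaginary fleet, exhausting the keyspace takes about a trillion years, which is the sense in which the required compute is “functionally infinite.”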

    Quantum computers pose a threat to certain encryption algorithms, but not all. This is because the math used to obscure the right answer happens to be easily guessable by quantum computers.

    There have been known “quantum-proof” encryption algorithms for decades; they just weren’t used for one reason or another. These are quantum-proof simply because the obscuring math happens to not be easily done by quantum computers. They end up in the same “you need infinite compute to break the encryption” case.

    These algorithms weren’t designed to be quantum-proof. They just happened to use different math for the encryption.

    Most systems should, and likely will, update to quantum-proof algorithms in the near future, because it’s fairly easy to upgrade and the newer algorithms are generally better at all aspects of encryption.

    9 votes
  3. Comment on What is your 'Subway Take'? in ~talk

    tech-taters
    Link Parent

    I can think of one, the first or second exit off the Oakland Bay Bridge going west into the city. But I generally agree with you, they’re not common.

  4. Comment on Tips/guides to turn my home into a smart home? in ~tech

    tech-taters
    Link

    I am just getting started with my smart home functionality too. I opted for HomeKit for the privacy and offline functionality. I think it's ridiculous that ecosystems exist that do not work if your internet goes down. The only argument for them is that you don't need to buy a hub device, which is silly when you're buying $20 light bulbs, $30 switches, and $500+ shades (well, in my case). It's not a cheap project.

    I have an Ethernet Apple TV as the Home Hub, and it also acts as my Thread Border Router. I am trying to only buy Matter over Thread devices, but have a couple of Matter over Wi-Fi light bulbs. I have an Eve smart plug that is Matter over Thread (MoT) and does a good job of extending Thread range to my MoT roller shade in our primary bedroom.

    I have an Unraid server that I might run Home Assistant on at some point. I think going down that route is powerful but, from my understanding, not simple. That's edging into “home labbing is my hobby” territory. At least for my basic house functions, I don't want to worry about it. My server is also far from robust.

    My device list is currently:

    • AppleTV 4K (Ethernet HomeKit Hub and Thread Border Router)
    • 2x LIFX RGB bulbs (Matter over Wi-Fi)
    • 1x Eve Smart Plug (Matter over Thread)
    • 1x SmartWings Roller Shade (Matter over Thread)
    • 1x Ecobee Thermostat (HomeKit over Wi-Fi)

    I am trying to stick with Matter, partially to support the open standard, partially to cover my butt in the event I want to change ecosystems. I think Thread mesh networking is a cool idea, and I like that the devices are not directly connected to my Wi-Fi network. I am always skeptical of IoT device security. I think having them on a separate protocol and 100% offline is a good reduction/diversification of the risk area. (Until, of course, we get Thread-based malware, lol.)

    I am somewhat happy with the feature support in the HomeKit ecosystem. There are some features, like “dim over time,” that you can't get without some Home Assistant hack. It does the basics well though, so overall I'm happy with the state so far.

    3 votes
  5. Comment on AirPano - Lut Desert, Iran in ~arts

    tech-taters
    Link

    Great photos, thanks for sharing! I found the contrast between the barren desert and cyberpunk guy sitting on the roof of a Tacoma with a headset pretty funny.

    2 votes
  6. Comment on What common misunderstanding do you want to clear up? in ~talk

    tech-taters
    Link Parent

    You can also make a case that current-day, multi-core, hyper-threaded CPUs aren't actually "CPUs" at all, since they often integrate graphics processing and split control and arithmetic between cores.

    In the spirit of clearing up misconceptions, I’m gonna hit you with a friendly ACTUALLY.

    It sounds like you’re referring to the whole package that companies like Intel distribute and the general tech-savvy population calls a CPU. Those are better considered SoCs (System on a Chip), since they do all those functions you listed. Maybe I misunderstood you though.

    It’s still correct to call each individual CPU core a CPU, even if it’s not so central anymore. Though the only person that will sell you just a CPU these days is an IP vendor, and that’d just be the design for one.

    Also, AMD is still selling SoCs in the same sense as Intel. Their products generally have CPUs, Memory Controllers, I/O Controllers, etc in a single package (okay, okay, it’s a System on Chiplets). They just leave off the GPU more often than Intel does.

    5 votes
  7. Comment on Looking for music solutions for my car; can anyone recommend a digital audio player? in ~transport

    tech-taters
    Link Parent

    As far as I’ve seen, the only “permission” a Bluetooth receiver requests is to sync contacts, for the purposes of using the in-car voice control. Maybe one to make calls as well.

    If you choose a Bluetooth based music player, it might not even ask for that. If it does, you just deny it. You won’t lose music player functionality.

    To specifically address the Ford case, that can only happen if the car software integrates the Bluetooth receiver. Nobody can snoop your text messages (realistically anyways) with an offline music player.

    Essentially, Bluetooth isn’t the problem there; it’s Ford (assuming you have the story right, I’m not making any assertions about that).

    1 vote
  8. Comment on What common misunderstanding do you want to clear up? in ~talk

    tech-taters
    Link

    CPU does not mean computer. It’s Central Processing Unit. The CPU is a complex digital circuit that is microscopically etched into a silicon crystal. (Shoutout @FlareHeart for clearing that one up) The computer is a box with a bunch of parts inside. Or a person/job title if you go back 50ish years.

    Every* computer has at least one CPU, nowadays more like 4-12. A CPU's job is to read instructions and do simple operations like addition, subtraction, and moving data.
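    That read-an-instruction, do-a-simple-operation loop can be sketched as a toy interpreter (the instruction names and registers here are invented for illustration; no real ISA looks like this):

```python
# A toy CPU loop: fetch an instruction, decode it, execute a simple
# operation, then move to the next one.
def run(program):
    registers = {"r0": 0, "r1": 0}
    pc = 0  # program counter: which instruction we're on
    while pc < len(program):
        op, *args = program[pc]      # fetch + decode
        if op == "load":             # put a constant into a register
            registers[args[0]] = args[1]
        elif op == "add":            # add one register into another
            registers[args[0]] += registers[args[1]]
        pc += 1                      # advance to the next instruction
    return registers

# Compute 2 + 3 the long way: load, load, add.
regs = run([("load", "r0", 2), ("load", "r1", 3), ("add", "r0", "r1")])
print(regs["r0"])  # 5
```

    A real CPU does the same loop in hardware, billions of times per second, with a much larger instruction set.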

    9 votes
  9. Comment on Looking for some video game suggestions based off some specific parameters in ~games

    tech-taters
    Link

    Hear me out, golf. If you’ve got the time and this doesn’t need to be a “while I’m holding a child” game, you might enjoy it.

    • No/skippable story
    • no/limited exploration
    • no/limited unlocks, no/limited power-ups (there is equipment, but there’s no reason to buy fancy clubs if you don’t want to)
    • but high in strategy and/or skill
    • pretty simple while still giving depth to it
    • picking right back up where you left off after not playing for months (totally normal for all casual golfers)

    The time commitment can be flexible, especially if you have a 9-hole course nearby, or if your 18 offers a 9-hole price. Twilight golf in the afternoon/evening, seasonally, is a great way to fit in 9-18 holes at significantly reduced prices.

    All this and you get to go for a walk, and/or drive a golf cart, which is a surprisingly fun minigame.

    3 votes
  10. Comment on What is a business/org that's so terrible no one should use if possible? in ~life

    tech-taters
    Link Parent

    Schwab has been pretty great for me. I only use them for my secondary checking+debit card account, plus a brokerage. Not daily banking.

    But the debit card is absurdly powerful. No foreign transaction fees, foreign conversion at the best daily rate, and all ATM fees fully refunded at the end of the month, no matter what network the ATM is in.

    Essentially I can walk up to an ATM anywhere on the planet, pull out local currency, and get the daily rate with no fees. Unbeatable for travel. The account is completely free with no minimum balance or deposits required.

    In terms of customer service, I recently goofed and forgot to move more money into that account before arriving in another country. I called customer service from the customs line and asked them to expedite the 3-day deposit review (in place for fraud prevention). The agent had the deposit go through the next morning. A+ agents who are actually empowered to solve problems.

    Considering moving the rest of my banking to them. Would recommend you give it a try. It’s free :)

    6 votes
  11. Comment on Should C be mandatory learning for career developers? in ~comp

    tech-taters
    Link Parent

    GPUs are another huge development that I don't really understand, along with M1 chips. I still think these things are still derivative of the theoretical foundation of computing, though.

    I'm not the most knowledgeable about GPUs specifically, but have worked with plenty of peripheral and acceleration devices. GPUs did, and continue to, significantly change computing. They're a completely different tool than a CPU and solve different classes of problems. Essentially somebody took the Arithmetic Logic Unit from the CPU, duplicated it a thousand times, and gave it some dedicated memory.

    This system allows the main system CPU to move data into the dedicated memory, chunk it into a bunch of small sections, and assign each section to one of those ALUs. If each ALU is able to independently and simultaneously operate on its assigned section of data, then you've just done a math operation on the whole data set in 1/1000th the time it would have taken the system CPU.

    This was relevant to computer graphics 20ish years ago because, it turns out, rendering graphics can be chunked into bite-sized jobs pretty well. It is also relevant today, because training and using Machine Learning models is ultimately just a bunch of Linear Algebra. Linear Algebra/matrix math can be reduced to a bunch of Multiply-Accumulate operations, which those ALUs can do in bite-sized chunks!
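    As a rough sketch of what “a bunch of Multiply-Accumulate operations” means: a matrix-vector product is one independent MAC per row, which is exactly why it splits so cleanly across many ALUs (plain sequential Python here; a GPU would run the rows simultaneously):

```python
# One multiply-accumulate: the basic operation each ALU performs.
def mac(row, vec):
    acc = 0
    for a, b in zip(row, vec):
        acc += a * b        # multiply, then accumulate
    return acc

def matvec(matrix, vec):
    # Each row's MAC is independent of the others, so a GPU can hand
    # every row to a different ALU and run them all at once.
    return [mac(row, vec) for row in matrix]

m = [[1, 2], [3, 4]]
v = [5, 6]
print(matvec(m, v))  # [17, 39]
```

    Full matrix-matrix multiplies and neural-network layers are just many of these row-sized jobs stacked up, which is the “bite-sized chunks” property the comment describes.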

    along with M1 chips

    If you're wondering about Apple's M-series of chips, I can talk about those too. There wasn't anything truly paradigm-shifting there, but it did create waves and leveraged some cool ideas that the industry wasn't quite expecting.

    As general background, the M-series chips (and the A-series in iPhones) are Systems-on-a-Chip or SoCs. The idea here is that the entire computer is all on one single* piece of silicon (*caveat for multi-die systems on an interposer or generally sharing a package). Traditional computers have separate devices all connected together. Separate CPU, GPU, Memory, I/O, etc. Putting it all together gets you some speed and power efficiency at the cost of money and complexity.

    That leads nicely into the first cool idea Apple implemented: Unified Memory Architecture. This is what you call a system that shares one pool of Memory/RAM for all devices, particularly the GPU and CPU, and integrates it into the same package. This gets you 2 benefits:

    1. Lower memory latency: Your RAM is physically closer, so you can access it faster. Pretty simple.
    2. CPU and GPU can flex their memory usage: If you have a 16GB memory pool, maybe you want some tasks to divide it 14GB GPU and 2GB CPU. Or vice-versa for a different task. Conventional systems have separate memories that are not shareable.
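    A toy sketch of benefit 2, using the 16GB pool from the example above (the 8GB/8GB fixed split is an invented comparison point, not any specific machine):

```python
TOTAL_GB = 16  # one unified pool shared by CPU and GPU

def fits_unified(cpu_need_gb, gpu_need_gb):
    # Unified memory: one pool, split however the current task needs.
    return cpu_need_gb + gpu_need_gb <= TOTAL_GB

def fits_fixed(cpu_need_gb, gpu_need_gb, cpu_ram_gb=8, gpu_vram_gb=8):
    # Conventional layout: separate RAM and VRAM; neither side can
    # borrow capacity from the other.
    return cpu_need_gb <= cpu_ram_gb and gpu_need_gb <= gpu_vram_gb

print(fits_unified(2, 14))  # True: the pool flexes toward the GPU
print(fits_fixed(2, 14))    # False: 14GB won't fit in 8GB of VRAM
```

    The same total memory serves more workloads when it can flex, which is the practical win of the shared pool.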

    The second idea Apple implemented is the ARM Instruction Set Architecture. ARM is nothing new; it's been the default ISA for smartphones almost from the beginning. But Apple was the first to deploy it successfully in a high-power/performance setting (laptops and desktops). Some people thought it couldn't be done, since high-performance chips had almost always used the x86 ISA from Intel or AMD.

    To wrap it up, why was this all so exciting? Nobody expected Apple to deploy their own ARM-based, high-performance SoCs and replace Intel's traditional CPUs. Intel had been the only game in town for Macs since the mid-2000s. Topping it all off, the M1 was extremely powerful AND power efficient. I'm typing this on an M1 Pro MacBook Pro, and this was the first time I had a laptop that could genuinely last a full work day. Never before.

    Now why was this not surprising? The M1 is just a bigger iPhone SoC! Apple had been designing these SoCs, with Unified Memory Architecture and relatively high-performance ARM CPUs, since their A4 SoC in 2010. The A4 was in the original iPad and the iPhone 4.

    3 votes
  12. Comment on Post graduation job search in ~life

    tech-taters
    Link
    Embedded/Systems software engineer chiming in. I will also suggest giving software engineering a try. As far as career paths go, it's pretty good all around. There are plenty of people who don't...

    Embedded/Systems software engineer chiming in. I will also suggest giving software engineering a try. As far as career paths go, it's pretty good all around. There are plenty of people who don't love it, but can tolerate it and do a good job.

    Most importantly, it is always easier to get into the deep technical work during your early career. You can move to a less technical role later if you don't want to stick with programming all day. It is much harder to go the other direction. People with technical backgrounds can become some of the best Sales Engineers and Project Managers.

    What coursework during your CS program:

    1. Were you good at?
    2. Did you enjoy?

    That info can help us suggest sub-industries worth applying to.

    Some unsolicited career advice as a software engineer. Learn to use the new AI tools, but for the love of god, get and stay comfortable without them. Never trust the computers. dons tinfoil hat

    3 votes
  13. Comment on Happy Gilmore 2 | Official trailer in ~movies

    tech-taters
    Link Parent

    I’m looking forward to this one personally. Feels like a good time to bring Happy back, considering the growth of golf since COVID.

    They’ve really managed to get almost everyone in golf involved too. Maybe it’s through Full Swing, which is also on Netflix?

    1 vote
  14. Comment on Software engineer lost his $150K-a-year job to AI—he’s been rejected from 800 jobs and forced to DoorDash and live in a trailer to make ends meet in ~tech

    tech-taters
    Link Parent

    I agree with the other responses here. Building something to show is always a good idea. I would add that whatever you choose to build should be ambitious or challenging enough that it gives you a better idea of the challenges faced by developers in that area.

    This can help you feel one step ahead of the interviewer, and better guess what they want to hear. As an example, no interviewer actually wants to hear about the basic backend API you threw together. They want to hear about how you load balanced huge numbers of requests to those endpoints and got the database to handle concurrent requests (disclaimer, I’m not a backend person).

    I do embedded and operating systems work. When I’m interviewing a candidate, I like to hear how they accidentally deadlocked 2 threads and went about fixing their synchronization primitives. I don’t really care that they wrote X-many lines of C and they blinked some LEDs.

    At first, it may be difficult to get a sense for what the interesting challenges are. For that, you really just have to expose yourself to the subspecialty in more depth. Whether that’s talking to experienced folks, reading, etc. That bit is up to your preferences and situation.

    4 votes
  15. Comment on Ethernet working but not working? At an absolute loss. in ~comp

    tech-taters
    Link

    Try factory resetting the router before swapping in a new test unit. Perhaps it has some retained setting for your PC's MAC address. That could explain why it works when directly connected to the modem, and why it works with a dongle.

  16. Comment on Where are the small phones? in ~tech

    tech-taters
    Link Parent

    As I mentioned in my original post, I am 100% on board with small phones as a standalone feature request, mainly because users can’t change their ergonomics.

    I’m also against e-waste. Most phones can have their batteries replaced professionally, or at home with a bit of hassle. Sure it’s not as convenient, but when you want the device to last 3-7 years, I think a bit of maintenance is reasonable. Manufacturers should provide the tools and service options.

    I will also add that there is absolutely a trade off between battery life, phone size, and the ability to replace it easily.

    I think it is easy to take for granted how much our phones do these days. Past phones may have had replaceable batteries, but they didn’t always have the pretty-good battery life we have now, and they certainly weren’t doing half as much work. That’s all to say, I understand why that feature is dropped by manufacturers.

    1 vote
  17. Comment on Where are the small phones? in ~tech

    tech-taters
    Link Parent

    As I mentioned in my original post, I am 100% on board with small phones as a standalone feature request, mainly because users can’t change their ergonomics.

    1 vote
  18. Comment on Where are the small phones? in ~tech

    tech-taters
    Link Parent

    Thank you for the detailed rundown. I have some follow up questions.

    If you’re scattering batteries around like acorns for winter, why not just have a few portable chargers that hold multiple full charges?

    What prevents you from staying plugged in for a few short stints throughout the day?

    If you’re swapping batteries instead of charging, why not use USB-C headphones or an adapter to your favorite 3.5mm pair?

    These seem like minor hurdles to me, but I understand everyone’s different. Why not make some small workflow changes to get a modern (not to mention secure) device?

    2 votes
  19. Comment on Where are the small phones? in ~tech

    tech-taters
    Link Parent

    In the context of these features, I don’t think it’s as simple as the manufacturer pushing their preferences. I think they are concerned with user preferences, just not those of the users requesting these particular features.

    To be clear, I am not defending any companies and saying they always do right by their users.

    Examples:

    SD Cards are notoriously unreliable and slow. There are faster SD interfaces, but asking the average person to figure out the naming conventions, personal requirements, and sourcing is not realistic. Just put more of the same fast internal storage inside and your parents are good to go.

    Need to move files around? As of the last few years, every flagship has 5-10Gbps of USB transfer speed. If a user knows how a thumb drive works, they can figure this out.

    Removable batteries take up more internal space than ones more tightly integrated. That’s a challenge for small form factor devices.

    I haven’t used a phone with a removable battery in a while, but unless they’ve started bolting the enclosure shut, the plastic clips are flimsy and pop open when dropped. I certainly don’t miss my battery skidding across the parking lot.

    Water and dust resistance is also much improved without a removable battery. It’s certainly possible to have both, but it comes with tradeoffs. Most phones can have the battery professionally replaced these days, and retain IPx ratings. Inconvenient sure, but some minor maintenance every few years in order to keep your phone running for nearly 10 years seems reasonable, for those that want to do that.

    IR blasters are always described with a novelty use case. They’re just not necessary, but I will admit they have whimsy and were fun. All about trade-offs.

    Most people really enjoy wireless headphones, and USB-C headphones/DACs/amps all exist. If you really have to charge at the same time and never take your headphones off, there are splitters and wireless chargers for cheap.

    All of these features were once somewhat common, if not standard, and went away because most people don’t know what an SD card is.

    7 votes
  20. Comment on Where are the small phones? in ~tech

    tech-taters
    Link Parent

    It sounds like we understand each other. No mainstream phone is ever going to have close to the complete list.

    I almost made a joke in my original post that the only people who’d buy such a phone are Arch Linux users. I was not aware that “Linux Phones” were a thing, but that sounds like the only way a phone could get anywhere near this feature list.

    Open-source hardware, baby. If someone wants an IR blaster in the Linux phone, they’d better submit a merge request.

    6 votes