tech-taters's recent activity
-
Comment on Should C be mandatory learning for career developers? in ~comp
-
Comment on Post graduation job search in ~life
tech-taters Embedded/Systems software engineer chiming in. I will also suggest giving software engineering a try. As far as career paths go, it's pretty good all around. There are plenty of people who don't love it, but can tolerate it and do a good job.
Most importantly, it is always easier to get into the deep technical work during your early career. You can move to a less technical role later if you don't want to stick with programming all day. It is much harder to go the other direction. People with technical backgrounds can become some of the best Sales Engineers and Project Managers.
What coursework during your CS program:
- Were you good at?
- Did you enjoy?
That info can help us suggest sub-industries worth applying to.
Some unsolicited career advice as a software engineer. Learn to use the new AI tools, but for the love of god, get and stay comfortable without them. Never trust the computers. dons tinfoil hat
-
Comment on Happy Gilmore 2 | Official trailer in ~movies
tech-taters I’m looking forward to this one personally. Feels like a good time to bring Happy back, considering the growth of golf since COVID.
They’ve really managed to get almost everyone in golf involved too. Maybe it’s through Full Swing, which is also on Netflix?
-
Comment on Software engineer lost his $150K-a-year job to AI—he’s been rejected from 800 jobs and forced to DoorDash and live in a trailer to make ends meet in ~tech
tech-taters I agree with the other responses here. Building something to show is always a good idea. I would add that whatever you choose to build should be ambitious or challenging enough that it gives you a better idea of the challenges faced by developers in that area.
This can help you feel one step ahead of the interviewer, and better guess what they want to hear. As an example, no interviewer actually wants to hear about the basic backend API you threw together. They want to hear about how you load balanced huge numbers of requests to those endpoints and got the database to handle concurrent requests (disclaimer, I’m not a backend person).
I do embedded and operating systems work. When I’m interviewing a candidate, I like to hear how they accidentally deadlocked 2 threads and went about fixing their synchronization primitives. I don’t really care that they wrote X-many lines of C and they blinked some LEDs.
At first, it may be difficult to get a sense for what the interesting challenges are. For that, you really just have to expose yourself to the subspecialty in more depth. Whether that’s talking to experienced folks, reading, etc. That bit is up to your preferences and situation.
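That deadlock story is a common one, so here is a minimal sketch of the bug and the usual fix. It's in Python for brevity, and the `transfer`/`accounts` names are made up for illustration, but the lock-ordering pattern is the same in C with pthreads:

```python
import threading

# Two accounts, each guarded by its own lock.
accounts = {"a": 100, "b": 100}
locks = {name: threading.Lock() for name in accounts}

def transfer(src, dst, value):
    # The deadlock-prone version acquires locks[src] then locks[dst].
    # Two opposite transfers can then each hold one lock and wait
    # forever on the other. The fix: every thread acquires the locks
    # in one global order (here, sorted by key), so a circular wait
    # can never form.
    first, second = sorted((src, dst))
    with locks[first]:
        with locks[second]:
            accounts[src] -= value
            accounts[dst] += value

t1 = threading.Thread(target=lambda: [transfer("a", "b", 1) for _ in range(1000)])
t2 = threading.Thread(target=lambda: [transfer("b", "a", 1) for _ in range(1000)])
t1.start(); t2.start(); t1.join(); t2.join()
print(accounts)  # equal transfers both ways net out: {'a': 100, 'b': 100}
```

Sorting the keys is the simplest way to impose a total order on lock acquisition; with pthreads you'd typically order by address or by an explicit lock ID instead.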
-
Comment on Ethernet working but not working? At an absolute loss. in ~comp
tech-taters Try factory resetting the router before swapping in a new test unit. Perhaps it has some retained setting for your PC's MAC address. That could explain why it works when directly connected to the modem, and why it works with a dongle.
-
Comment on Where are the small phones? in ~tech
tech-taters As I mentioned in my original post, I am 100% on board with small phones as a standalone feature request, mainly because users can’t change their ergonomics.
I’m also against e-waste. Most phones can have their batteries replaced professionally, or at home with a bit of hassle. Sure it’s not as convenient, but when you want the device to last 3-7 years, I think a bit of maintenance is reasonable. Manufacturers should provide the tools and service options.
I will also add that there is absolutely a trade off between battery life, phone size, and the ability to replace it easily.
I think it is easy to take for granted how much our phones do these days. Past phones may have had replaceable batteries, but they didn’t always have the pretty-good battery life we have now, and they certainly weren’t doing half as much work. That’s all to say, I understand why that feature is dropped by manufacturers.
-
Comment on Where are the small phones? in ~tech
tech-taters As I mentioned in my original post, I am 100% on board with small phones as a standalone feature request, mainly because users can’t change their ergonomics.
-
Comment on Where are the small phones? in ~tech
tech-taters Thank you for the detailed rundown. I have some follow-up questions.
If you’re scattering batteries around like acorns for winter, why not just have a few portable chargers that hold multiple full charges?
What prevents you from staying plugged in for a few short stints throughout the day?
If you’re swapping batteries instead of charging, why not use USB-C headphones or an adapter to your favorite 3.5mm pair?
These seem like minor hurdles to me, but I understand everyone’s different. Why not make some small workflow changes to get a modern (not to mention secure) device?
-
Comment on Where are the small phones? in ~tech
tech-taters In the context of these features, I don’t think it’s as simple as the manufacturer pushing their preferences. I think they are concerned with user preferences, just not those of the users requesting these particular features.
To be clear, I am not defending any companies and saying they always do right by their users.
Examples:
SD Cards are notoriously unreliable and slow. There are faster SD interfaces, but asking the average person to figure out the naming conventions, personal requirements, and sourcing is not realistic. Just put more of the same fast internal storage inside and your parents are good to go.
Need to move files around? As of the last few years, every flagship has 5-10Gbps of USB transfer speed. If a user knows how a thumb drive works, they can figure this out.
Removable batteries take up more internal space than more tightly integrated ones. That’s a challenge for small form factor devices.
I haven’t used a phone with a removable battery in a while, but unless they’ve started bolting the enclosure shut, the plastic clips are flimsy and pop open when dropped. I certainly don’t miss my battery skidding across the parking lot.
Water and dust resistance is also much improved without a removable battery. It’s certainly possible to have both, but it comes with tradeoffs. Most phones these days can have the battery professionally replaced and retain their IP ratings. Inconvenient, sure, but some minor maintenance every few years in order to keep your phone running for nearly 10 years seems reasonable, for those that want to do that.
IR blasters are always described in a novelty use case. It’s just not necessary, but I will admit it does have whimsy and was fun. All about trade offs.
Most people really enjoy wireless headphones, and USB-C headphones/DACs/AMPs all exist. If you really have to charge at the same time and never take your headphones off, there are splitters and wireless chargers for cheap.
All of these features were once somewhat common, if not standard, and went away because most people don’t know what an SD card is.
-
Comment on Where are the small phones? in ~tech
tech-taters It sounds like we understand each other. No mainstream phone is ever going to have close to the complete list.
I almost made a joke in my original post that the only people who’d buy such a phone are Arch Linux users. I was not aware that “Linux Phones” were a thing, but that sounds like the only way a phone could get anywhere near this feature list.
Open source hardware baby. If someone wants an IR blaster in the Linux phone, they better submit a merge request.
-
Comment on Where are the small phones? in ~tech
tech-taters Agreed that big numbers are easier to sell and likely influence product direction.
However, as someone involved on the silicon side of things, I will argue that the process node changes are a massive technology point. The numbers are meaningless now, but the technology is not.
Whether that should be a selling point to consumers is up for debate. It certainly does signal important information to the tech world and investors.
-
Comment on Where are the small phones? in ~tech
tech-taters I’m going to poke the bear here with good intentions. I don’t mean to be rude or accusatory with anything I say below:
Every time this discussion (and similar ones) about phone features comes up, I have the same thoughts. This vocal minority seems to always ask for the same list of features:
Headphone jack
Removable battery
Long-lasting battery
No camera bump
SD cards
Small size
IR blaster
Oh, and it has to be cheap too.
No manufacturer in their right mind is going to make this phone because the only people who would buy it are the 20 people who listen to FLAC music offline for 12-16 hours per day, and have $2000 to spend on a device.
There are certainly phones with a subset of those features. I can’t help feeling like the people asking for this are resistant to change.
To be clear, I get wanting a small phone. Ergonomics are the one thing you can’t adjust your workflow to accommodate.
If this is your dream phone, tell me about your use case, and what your current setup is.
-
Comment on What is a misconception you are passionate about and would like to clarify? in ~talk
tech-taters Your comment has big:
"What you’re referring to as Linux, is in fact, GNU/Linux"
energy and I'm greatly enjoying it. Proves my point that USB is a mess :)
Thanks for expanding on the connectors. I am a USB-B stan, despite excluding it. I appreciate the heft and the clear orientation for when you're bent over behind the printer.
USB-C is pretty great, all things considered, but the receptacle is too small for some cases, in my humble opinion.
-
Comment on What is a misconception you are passionate about and would like to clarify? in ~talk
tech-taters USB naming has become quite a mess, and it makes it difficult for consumers to understand what's going on.
First of all, there is USB the connector, and USB the communication protocol.
- Connectors
- USB-A: The one most people think of. Rectangular, non-reversible, flash-drives, mouse, keyboard, etc
- Micro-USB: "The old Android charger"
- USB-C: The one basically all new phones use. Reversible and oval-shaped.
Historically, these mainly carried the USB 1/2/3 protocols, at varying speeds, with their own mess of naming. These names included:
- USB 1.0/2.0 (Low Speed, Full Speed, and High Speed)
- USB 3.0 (AKA SuperSpeed)
- USB 3.1 (AKA SuperSpeed+)
- USB 3.2 (AKA USB 3.2 Gen 2x2)
See how the advertising names go off the rails? They've since retconned this and somewhat improved the naming, but the damage is done on 10+ year old protocols.
As an additional twist, any one of the above Connectors could carry any of the above Protocols and Speeds (There may be one or two combinations that were not possible, but the vast majority were possible).
The protocols also had different requirements for the construction of USB cables. Since cables are often unmarked or poorly manufactured, it is common for users to have a lower-speed cable and leave performance on the table. You can even have charging-only USB cables that completely omit data wires. This is probably what you're getting from a gas-station USB cable.
Enter, Thunderbolt and USB 4. Thunderbolt and USB 4 are fundamentally different protocols from those discussed above, but still use USB-C connectors. Both protocols are able to carry the above USB 2/3 protocols, in addition to others such as DisplayPort and PCIe.
Thunderbolt and USB 4 cables have much higher quality requirements to accommodate ridiculously high speeds. These high speeds also mean that cables have length limits, typically less than 2 meters. Cables can be longer, but require special built-in hardware to boost and retransmit the signal over the longer distance. So next time you see a PC vendor selling a $50 "USB-C to USB-C" cable, it may not be a (total) rip-off. Check the specs.
The groups in charge of USB seem to know that naming is an issue, but they are still making some silly naming decisions. We can hope for a brighter future.
I'm not the most knowledgeable about GPUs specifically, but have worked with plenty of peripheral and acceleration devices. GPUs did, and continue to, significantly change computing. They're a completely different tool than a CPU and solve different classes of problems. Essentially somebody took the Arithmetic Logic Unit from the CPU, duplicated it a thousand times, and gave it some dedicated memory.
This system allows the main system CPU to move data into the dedicated memory, chunk it into a bunch of small sections, and assign each section to one of those ALUs. If each ALU is able to independently and simultaneously operate on its assigned section of data, then you've just done a math operation on the whole data set in 1/1000th the time it would have taken the system CPU.
This was relevant to computer graphics 20ish years ago because, it turns out, rendering graphics can be chunked into bite sized jobs pretty well. This is also relevant today, because training and using Machine Learning models is ultimately just a bunch of Linear Algebra math. Linear Algebra/Matrix Math can be simplified to a bunch of Multiply-Accumulate operations, which ALUs can do in bite-sized-chunks!
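The chunking idea above can be sketched as a toy simulation. This is sequential Python standing in for hardware, with made-up names (`alu_mac`, `parallel_dot`); a real GPU runs the chunks simultaneously across thousands of lanes:

```python
def alu_mac(a_chunk, b_chunk):
    """One ALU's job: a multiply-accumulate loop over its slice of the data."""
    acc = 0
    for a, b in zip(a_chunk, b_chunk):
        acc += a * b
    return acc

def parallel_dot(a, b, num_alus=4):
    """Chunk two vectors and hand each chunk to a (simulated) ALU.

    On real hardware the chunks are processed at the same time; here we
    just loop over them, but the partial results combine the same way.
    """
    size = (len(a) + num_alus - 1) // num_alus  # ceiling division
    partials = [
        alu_mac(a[i:i + size], b[i:i + size])
        for i in range(0, len(a), size)
    ]
    return sum(partials)  # the final "reduction" step

a = [1, 2, 3, 4, 5, 6, 7, 8]
b = [8, 7, 6, 5, 4, 3, 2, 1]
print(parallel_dot(a, b))  # same answer as a plain dot product: 120
```

Each partial result is one multiply-accumulate loop over a chunk; on a GPU those loops would run in parallel, with the partial sums combined in a reduction pass at the end.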
If you're wondering about Apple's M-series of chips, I can talk about those too. There wasn't anything truly paradigm-shifting there, but it did create waves and leveraged some cool ideas that the industry wasn't quite expecting.
As general background, the M-series chips (and the A-series in iPhones) are Systems-on-a-Chip or SoCs. The idea here is that the entire computer is all on one single* piece of silicon (*caveat for multi-die systems on an interposer or generally sharing a package). Traditional computers have separate devices all connected together. Separate CPU, GPU, Memory, I/O, etc. Putting it all together gets you some speed and power efficiency at the cost of money and complexity.
That leads nicely into the first cool idea Apple implemented: Unified Memory Architecture. This is what you call a system that shares one pool of Memory/RAM for all devices, particularly the GPU and CPU, and integrates it into the same package. This gets you two benefits: the CPU and GPU can work on the same data without copying it between separate memory pools, and the short, tightly integrated connections improve bandwidth and power efficiency.
The second idea Apple implemented is the ARM Instruction Set Architecture. ARM is nothing new; it's been the default ISA for smartphones almost from the beginning. But Apple was the first to deploy it in a high-power/performance setting (laptops and desktops). Some people thought it couldn't be done, since high-performance chips had almost always been x86 ISA from Intel or AMD.
To wrap it up, why was this all so exciting? Nobody expected Apple to deploy their own ARM-based, high-performance SoCs and replace Intel's traditional CPUs. Intel had been the only game in town since the mid-2000s. Topping it all off, the M1 was extremely powerful AND power efficient. I'm typing this on an M1 Pro MacBook Pro, and it's the first laptop I've ever had that could genuinely last a full work day.
Now why was this not surprising? The M1 is just a bigger iPhone SoC! Apple has been designing these SoCs, with Unified Memory Architecture, and relatively high-performance ARM ISA CPUs since 2010 and their A4 SoC. The A4 was in the original iPad and the iPhone 4.