Well, if you can do it, do it. But in my experience, using an analog computer is nothing at all like digital. I used to have to maintain one when I worked at the University of London, back in the very early 80s (basically making sure plug-board wires hadn't gone bad). Programming one (if you can call it that) required a bit of mathematical nous (which I didn't have enough of, though I was pretty sharp at digital), and the academic I worked with (who did) used to spend a lot of time saying "f*ck" as he tried to set up things like predator-prey simulation demos for the students.
The author is not talking about using an "analog computer"; they are talking about designing analog circuitry:
> Among them are IoT nodes, sensors and front-ends, actuators including motors, test and measurement instrumentation, boards and interconnects, line drivers/receivers, motor drivers, physical-level wired/wireless links, MEMS devices, RF functions, power sources and systems, EMI, noise, thermal extremes…and that’s just a partial list. These roles and functions are not going away as long as the laws of physics, as we know them, remain in place.
(I also think it's misleading to use the term "computer" for things like differential analyzers, just as it's misleading to call a person who adds up numbers a "computer", even though both usages were well established before the invention of the inherently digital devices we call "computers" today. But that's a different discussion.)
I think the real point of the article is a job or occupation (even just writing code to support analog design work), not necessarily going all the way back to the wonders of using analog computers.
But that's a very cool story... do you remember which model of analog computer it was?
Can't remember, I'm afraid. Some obscure British company I guess. Probably one that made music synths back then, as it's the same sort of tech.
Linn? They used to make electronic drum kits and briefly dabbled in computer design. Byte Magazine (I think) had a cover story on them, but as I recall, their system was object-oriented, not analog.
No, I don't think Linn, as I would have actually heard of them.
But if you told me the actual name, I really would not recognise it - so long ago.
It seems as though analog doesn't mean what it used to; these days it's a stand-in for physical. The physical thing you make may be analog, yet it could very well be digital. The important thing is that the product is physical, rather than a bundle of bits you ship to someone else who takes care of the hardware it runs on. The tone of the article leads me to think that this is what the author is talking about.
The author is explicitly talking about designing analog electronics:
> Among them are IoT nodes, sensors and front-ends, actuators including motors, test and measurement instrumentation, boards and interconnects, line drivers/receivers, motor drivers, physical-level wired/wireless links, MEMS devices, RF functions, power sources and systems, EMI, noise, thermal extremes…and that’s just a partial list. These roles and functions are not going away as long as the laws of physics, as we know them, remain in place.
did you... read the article?
yes, i did. i thought it was not much good. these days i really only post anecdotes here, as i don't do any dev work now. humour me!
If you want to get your feet wet with analog electronics, I'd suggest getting an Arduino starter kit with a breadboard and some components, plus a cheap multimeter and oscilloscope, and just start playing with things. You can fairly quickly build up an intuition for things if you've got something you can get your hands on.
Once you get the hang of the basics at audio and low RF frequencies, you can then set up GNU Radio, which works with your computer's audio I/O. Maybe add a $30 RTL-SDR dongle, and the next thing you know, you've got a bit of RF under your belt.
I respect those who have come before, and I know there still exist some places where analog is not only the superior option but the only option. However, almost everything you want to do now is ADC in, then back out through a DAC.
Analog lives in a few niches:
- As you note, signal conditioning to stuff things into an ADC
- Anywhere firmware is viewed as a liability (often medical or other hi-rel stuff)
- Existing proven designs (do not underestimate this sector!)
- Anywhere the cost of the signal conditioning circuitry might be comparable to the cost of just doing it outright in analog. This is mostly the low-cost realm, more rarely ultra-low-power, but sometimes you see it in other places too.
- Line power supplies happen to be all of the above, so you see plenty of analog there
You used to see analog in high-performance stuff (ultra-high-speed/RF or ultra-high-bit-depth), but this has mostly gone into either digital or whackadoodle exotica. Like frontends for those 100GHz oscilloscopes, which are both!
Most of analog design nowadays is getting the signal to the ADC, the ADC itself, and the clock generation for the ADC. The ADC is by far the most complex subsystem, and it's partially digital, partially analog, though the analog parts are also quite algorithmic (binary search, linear search, majority voting, LMS).
The reverse path (the DAC) is less common; in maybe 10% of cases you need a good DAC for signal generation. It's more hardcore analog, and a good DAC is harder to design.
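To make the "algorithmic" part concrete: a SAR ADC's control logic is literally a binary search, with the comparator as its oracle. A minimal, idealized Python sketch (hypothetical 8-bit converter; real ones add noise handling, settling time, and redundancy):

```python
# Idealized SAR ADC: binary search over output codes, guided by a comparator.
def sar_adc(vin, vref=1.0, bits=8):
    code = 0
    for bit in reversed(range(bits)):         # MSB first
        trial = code | (1 << bit)             # tentatively set this bit
        dac_out = trial / (1 << bits) * vref  # ideal internal DAC voltage
        if dac_out <= vin:                    # comparator decision
            code = trial                      # keep the bit, else drop it
    return code

print(sar_adc(0.42))  # -> 107, i.e. floor(0.42 * 256)
```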
Lol, I wish. I'm a SWE because it pays much more than using my EE degree.
many such cases...
IMO, the biggest reason America lags so far behind in many engineering fields is the success of our SW industry. It's not just EEs. Our SW industry is so successful and so lucrative that it starves other fields of bodies.
If the other fields don't pay well, it's their own fault.
I wrote what I think is a deeper explanation of the phenomenon at https://news.ycombinator.com/item?id=43304835.
Tangentially related:
You can play around with analog programming of a sort with modular synthesizers. It's a pretty neat way to dip your toe into analog signal processing.
This is a great suggestion!
Another couple of ways to get started with analog signal processing:
- Build an AM radio from transistors. There are lots of tutorials out there.
- Simulate circuits with Falstad's circuit.js. There are some interesting analog circuits already in the set of examples, like https://tinyurl.com/24gccg7p.
- Build an Atari Punk Console.
You can get very, very good op-amps very cheaply these days. Some of them even still come in through-hole packages. This makes it possible to build interesting audio synthesizer circuits for pennies that would have required significant money outlay in the 70s.
That's been my pastime away from the digital screens of my day job. But I didn't go full modular, instead getting 3 analog monosynths. I need some structure while learning music at the same time and can't venture into the wild world of modular. My gear has lots of CV/Gate ins/outs for that later step though.
What is analog? Voltages and circuits and currents, but not digital tubes and transistors?
Most coders in my vicinity are interested in woodworking, is that analog? I think not.
It's a matter of representation. Do the signals represent continuous or discrete quantities? A digital signal represents a discrete quantity such as an integer or symbol, or a sequence of those quantities. Digital systems possess the feature of "noise immunity," where a signal can be unambiguously interpreted due to rules that involve thresholds. For instance you can look up an oscilloscope trace of the signals on a USB or Ethernet cable, and they look horrid, but those signals can transmit information with virtually zero error.
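A toy demo of that thresholding idea, with made-up 3.3 V logic levels and additive Gaussian noise (no particular line code or standard implied):

```python
# The received waveform looks "horrid", but thresholding at mid-rail
# recovers the bits with essentially zero error.
import random

bits = [random.randint(0, 1) for _ in range(10_000)]
noisy = [3.3 * b + random.gauss(0, 0.4) for b in bits]  # volts on the wire
recovered = [1 if v > 1.65 else 0 for v in noisy]       # threshold decision

errors = sum(b != r for b, r in zip(bits, recovered))
print(f"bit errors: {errors} / {len(bits)}")  # almost always 0
```

Shrink the noise relative to the threshold margin and the error rate falls off exponentially, which is why digital links can be effectively error-free.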
To expand a bit, since my day job involves this stuff, physical stimuli are always analog. Even the discrete energy levels of an atom make their transitions in continuous time. Yet there are good reasons to do virtually all computation in the digital domain, where "noise immunity" allows processing to occur without the introduction of additional noise, and you enjoy all of the other benefits of computer programming.
These days, the job of the analog person is often to understand the physics of the quantity being measured, and the sensor, but to get a signal safely to the front end of an analog-to-digital converter.
Now, the irony is that I actually spend most of my time working in the digital domain. The reason is that analysis of the digital data stream is how I know that my analog stuff is working, and how I optimize the overall system. So if you watched me work for a week, you'd notice that I actually spend a fair portion of my time coding. I just don't write software for other people to use. That's another department, and their work usually starts after mine is done.
The author is specifically talking about designing analog electronic circuits:
> Among them are IoT nodes, sensors and front-ends, actuators including motors, test and measurement instrumentation, boards and interconnects, line drivers/receivers, motor drivers, physical-level wired/wireless links, MEMS devices, RF functions, power sources and systems, EMI, noise, thermal extremes…and that’s just a partial list. These roles and functions are not going away as long as the laws of physics, as we know them, remain in place.
Woodworking can be analog if the wood shapes and positions (and maybe velocities, etc.) are used to quantitatively represent something other than the wood itself, as an analogue to those quantities. For example, you can carve some wooden cams to drive a little automaton, or you can make a clock out of wood gears, where the angles of rotation of the gears represent the amount of time that has passed. But this article is specifically about electronics.
Anything not digital.
Coders already do love it. Terrible premise.
>>> Someone with programming experience could contribute in many of these areas, and still work exclusively at their keyboards and not even getting their hands dirty, if that’s their concern.
I could probably be described as living in the "analog" domain, as a physicist working for a company that makes measurement equipment. Naturally, this could be an ingrained bias, but I've formed the impression that something about getting your hands dirty confers the intuition needed to work productively in this domain. You need to experience being proven wrong by mother nature, over and over again.
Also, if you're sitting at your screen all day, nobody's going to pull you into the loop. It's quicker to just do that stuff ourselves, than to explain it to someone.
So I agree with everything else in the article, because I love analog and love coding. But come on, join us in the lab.
I've got a relative who works at Analog Devices. They're on their third straight month of crunch time, working 12-hour shifts through the weekends.
Why? Because the dipshits in leadership decided to project the revenue growth during the chip shortage as a straight line for the next 10 years.
Looks like those same dipshits decided the best course of action is to get their soft skulled alumni to write some blog posts to try to herd more cattle into the grinder.
These jobs are far fewer and pay less, and they're no more resilient to AI progress, which makes this useless advice.
Hardware might or might not be more resilient against AI in the long run, but for now, AI is sure doing a terrible job at hardware.
It is somewhat ironic that the single profession AI is best at replacing seems to be software engineering.
Not even software engineering, not even in the wettest dreams of the modern dysfunctional CEOs who use AI to justify layoffs.
I feel like we need to say this more and louder. I'm getting pretty tired of all the breathless AI hype.
Programming in analog won't pay anywhere near as well as programming in digital.
So telling people to move over to analog will depress that job market even more than it already is.
I love all things analog other than my macbook.
Smart things drive me completely insane and I find peace with things that just work without a wifi connection or firmware of any kind.
As someone in the EE field: the jobs exist but are not plentiful. The physical engineering fields in the US have largely shrunk due to offshoring, centralization into major OEMs, and general efficiencies in doing the work. "Analog" is a very cost-sensitive and optimized arena.
Are there decent paying jobs now? I was an EE then CompE major and, while I enjoyed it and think my engineering degrees were solid, all the jobs and better pay were in software. I'm glad I liked to program as a hobby, otherwise my career would have been a lot more challenging.
No. This is a serious problem. The traditional engineers (EE, MechE) don't program, because the ones who do, have gone into software development. The people who remain are freaks like me, who for some reason, weren't interested in becoming software developers. Many of us have a physical science background.
My hypothesis is that if you can produce a ten-million-dollar product using a million dollars of lab equipment, the ten million dollars are probably going to go to the guy who bought the lab equipment, while if you can produce a ten-million-dollar product using your laptop, you can probably keep the dollars.
Indeed, I've given up trying to analyze the market, especially because I'm late-career and at a "staff plus" level. I think to some extent, programmers have a certain exclusivity because only a certain fraction of people can learn to program -- for reasons we don't understand -- and they have a high degree of mobility. This allows programming to function in a fashion akin to a guild, with collective awareness and action about wages and working conditions. Also, I think that open-source tooling plays a role. Ironically, "the workers own the means of production."
As someone in a similar position, I have a different take. I think SWE is compensated more highly for different reasons. I disagree that programming is any more difficult to learn than EE; I think it's pretty clear that they're broadly on the same tier and require similar (but not identical) skill sets. I think that only a small fraction of people are well suited to either.
The biggest difference is that software is just closer to "the product". When something goes wrong on a typical embedded device, the first thing they say isn't "well, better bring in exmadscientist to redesign the board", it's "let's fix this in firmware". And so on and so forth. Most electrical projects just aren't, directly, the product. There's a big layer of software in between, and that software becomes the face of the product or even business and captures so much of the mindshare.
The other big reason is simply that the complexity of software is unbounded. For me, there are only so many parts I can stuff onto a board. But software has no such limits. I was just looking at 2.5GbE Ethernet switch chips for a hobby project -- a hobby project -- and concluded that they weren't bad at all and would take me somewhere around 40 hours to deal with, start to finish. That's for one of the nastier things around in your typical consumer environment (short of a 5GHz CPU) and represents a tremendous amount of investment on so many levels to get things to be not only possible, but down to the level that a seniorish engineer can just do it that quickly.
In contrast, a dinky web interface to manage the stupid thing would also probably be around 40 hours (less if crappy, way more if done "to modern standards"). Which is kind of insane, when you think about how many Gbps SERDES links are in each project! But software can do whatever it feels like, while the hardware side necessarily has constraints on it, so this is what we get.
However, I also think things are pretty imbalanced right now: SWE is somewhat overpaid, and will correct down over time. So it's not a great career to leverage the farm against, though I'd say it's never going to pay worse than EE or ME.
It sounds like you're advocating an alternative explanation to the one I gave, but without understanding that it's different. This probably means that my explanation wasn't very clear.
The workers owning the means of production doesn't seem ironic at all to me in this case; it's textbook Marxist economic theory. The means of production for software are, from Marx's point of view, your laptop and maybe a colo box. When you own (or rent!) them, you aren't alienated from them (in the purely objective sense of https://en.wikipedia.org/wiki/Marx%27s_theory_of_alienation#... rather than any emotional sense) and consequently you own the product. By contrast, when you depend on your employer for access to the means of production, they own the product of your labor.
Open-source tooling, based on the idea that "software should not have owners" (https://www.gnu.org/philosophy/why-free.html), enables every worker to own not only their own computer but their own compiler, linker, etc., rather than depending on angel investors to invest the capital necessary to license them from Computer Associates. You are precisely correct there.
Where my analysis departs from orthodox Marxism (perhaps because I am in fact a liberal) is that I see this as a question of bargaining power rather than a Hegelian mystical thing.
Civil engineers have very little bargaining power because the state power necessary to build a highway is very scarce indeed, while the expertise required to design it is relatively abundant.
Electrical engineers have more bargaining power because they can choose between many potential employers, who have to compete with one another for their relatively scarce skills, but bringing a new electronic product to market still requires a significant investment and several months, if not a year or more. For more advanced electronics like submillimeter ASICs, we're probably talking about several years and tens of millions of dollars.
Programmers have enormous bargaining power because they can bring a salable product to market over the weekend with an investment of a few hundred dollars. Or a few thousand if they're targeting the iPhone.
So I don't think collective awareness and action are what's going on here. I think individual programmers are in a better bargaining position than individual electrical engineers, and consequently they get better bargains individually, without functioning in a fashion akin to a guild. (And collectively owning the means of production, as orthodox Marxism prescribes, would not provide the same benefits. Observably it did not, neither in the Soviet Union and Mao's China nor when US retirees owned the majority of the stock market through their pension funds.)
Incidentally, there have been a lot of innovations over the past 25 years or so that have greatly dropped the investment required to bring a new web service online. Sourceforge and later GitHub and then GitLab eliminated the need to spend a week configuring a server to support a software team. Rackspace and then Amazon Web Services eliminated the need to buy the server, haul it over to the colo, and maybe commit to a service contract. MySQL (now MariaDB), Postgres, and SQLite eliminated the need to license Oracle. (Linux had already eliminated the need to buy a Solaris license 25 years ago, but a lot of people hadn't noticed yet.)
Companies like JLCPCB, PCBWay, and OSHPark seem like they're sort of trying to do the same thing with PCB products, FPGAs and especially Yosys are doing the same thing with digital designs, and companies like Skywater, (the nonprofit) IHP, and Matt Venn (Tiny Tapeout) seem to be trying to do the same thing with ASICs, including mixed-signal ASICs. (I'd list Efabless here, but they seem to have gone out of business last week.)
But being able to pay US$10 for one-week turnaround on a stack of prototype PCBs isn't going to replace a 1GHz LeCroy oscilloscope or a Kuka industrial robot, so I'm not confident that they'll have the same effect.
There are a lot of interesting points there, but you've totally bungled most of the things relating to Marxism, confusing the subject of alienation with its main mechanism, and confusing Leninism/Maoism—a line of revisions of Marxist theory that attempts to provide a roadmap bypassing private capitalist development—with “orthodox Marxism”.
Also, orthodox Marxism sees material relation to the capitalist economy as something of a continuum—the three main classes consist of variations of degree of relative importance of two (labor and capital) means of interacting with the economy—the capital-dominant group is the haut bourgeoisie, the labor-dominant group is the proletariat, and the middle class, the petit bourgeoisie, has significant dependence on both labor and capital (the textbook case being someone who applies their own labor, rather than rented labor, to their own capital to produce goods or services, though there are other mixes possible that are also petit bourgeois.) It is purely material, not mystical.
I appreciate the correction! Could you be more specific? I'm not sure what I've gotten wrong exactly, and I'd like to stop getting it wrong. Of course I understand you can't pack Das Kapital into an HN comment.
That's a nice analysis, and I agree with you about Marx. I'm a liberal too.
Oddly enough I have a side-business that makes an electronic gadget -- think something like a guitar pedal. But I have to choose my battles very carefully to avoid needing any kind of capital investment to speak of, and the real barrier to entry is the knowledge that I've gained from being immersed in the industry. My entire physical capital is less than what an engineer's employer pays per year for a seat of a CAD package.
A very big barrier that software developers don't face is regulatory approval.
It’ll never happen. Analog circuit design takes a level of intelligence that, frankly, most programmers are nowhere near. Digital is black and white, by the book, and you can trial-and-error your way to a solution. Analog is an art that actually requires deep knowledge of conflicting parameters. Technically EVERYTHING is analog—there is no such thing as digital—but coders just don’t have the genetics for analog.
> Analog circuit design takes a level of intelligence that, frankly, most programmers are nowhere near.
I'm skeptical of this comment, and of possible bias given your username.
I'd think roughly the same number of people who are able to learn to code at more than just the hello-world/VB-macro level could learn analog circuit design. It's just that the interest isn't there.
Because I know nothing about analog circuits, I find your comment very interesting. Can you give examples of conflicting parameters? And why those parameters can't be simulated away?
Well, let's say your op-amp is too slow. You can get a faster op-amp, but it uses more power and has a higher input bias current. Another op-amp is just as fast and has a thousand times smaller input bias current, because it has JFET inputs, but as a result it has a much higher input offset voltage. Also it doesn't have rail-to-rail inputs, and you need rail-to-rail inputs. Another alternative has a lower offset voltage again (though also a larger input bias current, but you resolve to lower the output impedance of the thing feeding it so the bias current isn't a problem) and does have rail-to-rail inputs, but three weeks later you find out it has terrible crossover distortion when the input transitions from the NPN Darlington stage it uses for inputs up near the positive rail to the PNP Darlington stage it uses for inputs down near the negative rail, and you curse yourself for not paying more attention to what the old guys were saying. Also it has a lot more noise than the JFET-input op-amp you were thinking about before. And so on.
Basically with a digital circuit you mostly only care about two things: whether it computes the function you want to compute, and how fast it is. Circuits that don't compute what you want to compute can simply be ruled out, and among the circuits that work, the faster the better.† Digital circuits don't have input bias currents, or rather their input bias currents don't introduce error. They don't have dropout voltages or non-rail-to-rail inputs or offset voltages or power supply rejection ratios or noise figures. Either they compute the right answer or they don't.
But in analog design, nothing computes exactly the right answer. Every component introduces errors of different kinds in varying amounts. So there are a lot of different desirable parameters, everything trades off against everything else, and which parameters matter most depends on the situation. If you're designing a circuit that gets used in a lot of different situations, like a new op-amp IC, you have to kind of guess which of those situations are the most important ones.
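To put rough numbers on one of those tradeoffs: here's a toy DC error budget in Python, comparing a hypothetical low-offset bipolar op-amp against a hypothetical JFET-input one. All parameter values are invented for illustration, not taken from any real datasheet:

```python
# Input-referred DC error: offset voltage plus the voltage the input bias
# current develops across the source impedance (nA * kohm = uV).
def input_error_uV(v_os_uV, i_bias_nA, r_source_kohm):
    return v_os_uV + i_bias_nA * r_source_kohm

for r in (1, 100):  # source impedance in kilohms
    bipolar = input_error_uV(v_os_uV=25, i_bias_nA=80, r_source_kohm=r)
    jfet = input_error_uV(v_os_uV=2000, i_bias_nA=0.05, r_source_kohm=r)
    print(f"Rs = {r:>3} kohm: bipolar {bipolar:.0f} uV, JFET {jfet:.0f} uV")
```

At 1 kohm the bipolar part's low offset wins; at 100 kohm the JFET part's tiny bias current wins. Same two op-amps, opposite conclusions, which is the sense in which it all depends on the situation.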
I don't think it's true that analog circuit design is harder than programming. Like cooking, how hard it is depends on what you're doing. In all three cases you have problems of a whole range of difficulty from "trivial even for a beginner" to "beyond human capability", and, for more difficult problems, deep knowledge can diminish the amount of trial and error required but never eliminate it.
______
† This is kind of a lie. Slew rates that are faster than you need can cause ground bounce and impact your EMC, but those are analog phenomena and usually of only peripheral interest. Power consumption, another analog phenomenon present in digital circuits, is always a concern if you're on battery, though much more so for analog designs. If you're doing asynchronous logic design, you have to worry about glitches, so faster isn't always better, but almost nobody does asynchronous these days because synchronous logic is so much easier and almost always adequate. Finally, cost trades off against other desirable attributes in any kind of engineering, even digital circuit design. Still, it's a lie that's more true than false.
Getting career advice from boomers is pretty useless. The world's just too different nowadays.
OK boomer.
Dallas County Community College has a career education program (i.e., like a trade school) in a field called “mechatronics.”
When the presenter explained it, it turned out to be programming and managing the systems that do warehouse / product movement in facilities owned by scrappy little companies like Wal-Mart and Amazon…you know, because humans need bathroom breaks and pesky things like safety considerations. Apparently graduates walk into the field regularly getting $70-80,000-a-year jobs, which to me sounds really low. Then again, the program is like 18 weeks and a surrogate for higher education in a field where demand exists.
So in a way the grandpa who wrote this article is right, but little does he know it’s eliminating low skill jobs that his meth addled nephew might be actually qualified to do!