I am disappointed that most of these are just “X with syntax highlighting”. Maybe it's because of that one time Rob Pike bit me, and all my jackets turned Hawaiian, but I fail to see why people are going mad about it. And since we're talking about command-line programmes, which often imply Unix, why not just have one programme that consumes code and other commands' output through stdin and spits out uglified multicoloured code to stdout? Whatever happened to composability? At this point I seriously expect Apple to remove the pipe key completely from future products.
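Something along those lines already composes today - a sketch, assuming you have pygmentize from the Pygments package installed (file names made up):

    # one programme: source in on stdin, ANSI-coloured text out on stdout
    cat main.go | pygmentize -l go -f terminal | less -R
    # and it consumes other commands' output just as happily
    diff -u old.py new.py | pygmentize -l diff -f terminal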
And while we're at it, why the hell do I need a GPU-accelerated terminal emulator? Bah!
</rant>
> I am disappointed that most of these are just “X with syntax highlighting”.
I use quite a few of them at work. They are not just what you say: they are also easier to use than the tools they replace. They save me from remembering recipes and concoctions to perform specific actions, spare me many trips to man, and in general they save time.
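A concrete example of the kind of recipe I mean, next to its modern replacement - a sketch assuming ripgrep is installed, with a made-up pattern:

    # the concoction I used to keep in my head (or dig out of man pages):
    find . -path ./.git -prune -o -name '*.py' -print0 | xargs -0 grep -n 'TODO'
    # the modern equivalent; .git and binary files are skipped by default:
    rg -n -t py 'TODO'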
> ...why the hell do I need a GPU-accelerated terminal emulator?
Why would you not want a program that you use dozens of times each day to harness your system resources most effectively? To me, this is tantamount to saying 'why the hell do I need a 64-bit terminal emulator' or even 'why the hell do I need an x86 protected-mode terminal emulator' - and yes, I know that's hyperbolic, but my point stands - if the tech is there, why not use it?
> (…) if the tech is there, why not use it?
Following your logic, people should attach jet-packs to their lawnmowers and fill their refrigerators with liquid helium. “The tech is there”, after all. A GPU is a Graphics Processing Unit. Graphics, as opposed to text. This is why we have a distinction between command-line (or, textual) and graphical user interfaces. Do I want my window manager and my web browser to be GPU-accelerated? Sure! My terminal? No more than I want a jet-packed lawnmower or a near-zero-kelvin refrigerator.

Besides, what difference would it really make? Do I really want my cursor to blink at 240 FPS? I could somewhat get it in a bizarro world where bb-like pseudographics are popular, but in my terminal, which mostly runs Vim with syntax off? What good would it bring me?
CPUs do not like moving images around. They're not optimized for it, and they consume more power than a GPU performing the same task. GPUs are very, very good at moving images around and compositing them. See Isaac Lewis's post: https://ike.io/a-gpu-accelerated-terminal-emulator-why/
You can read more about how and why Alacritty is fast, but the short version: ~500 frames per second thanks to OpenGL; intelligent budgeting of paint time vs. parse time to allow for parsing huge (GB-sized) files; and no frames are drawn except when state changes. This adds up to a super-fast, battery-friendly emulator that supports large files and works on (almost) every platform.
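For a rough feel of the throughput half of that claim, here is a crude sketch you can run in any terminal (the Alacritty project maintains a proper benchmark tool, vtebench, but this one needs nothing extra):

    # crude check: time how long the terminal takes to consume ~100 MB of text
    base64 /dev/urandom | head -c 100000000 > big.txt
    time cat big.txt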
> Following your logic, people should attach jet-packs to their lawnmowers and fill their refrigerators with liquid helium. “The tech is there”.
If my refrigerator already had liquid helium in it anyway, and it would perform tasks better by utilizing that liquid helium, and there would be no additional costs or annoyances to me through its usage of liquid helium, then yes - I would like my refrigerator to make use of liquid helium.
What I mean by 'the tech is there' is that your machine surely already has a GPU in it, and as long as it's not being utilized, it's just a redundant lump of silicon. Your absurd metaphors obviously do not address my point, for this reason: no lawnmower comes with a jet-pack; jet-packs are loud, dangerous, and expensive; and there is no way in which a jet-pack can reasonably aid the operation of a lawnmower (or maybe I just lack imagination...).
Your emphasis on 'Graphics Processing Unit' is a bizarre one, given that anyone with an understanding of modern computing realizes the term is to some extent an anachronism. While it's true that graphics processing still represents a large proportion of the computation for which GPUs are employed, the modern GPU could more accurately be described as a parallel processing unit, so limitless is the range of roles to which such hardware may effectively be applied. Even the most ignorant of teenage gamers knows this - if only because they were forced by deep-pocketed crypto miners to take a shiny new Nvidia-branded brick off their Christmas list.

When it comes to prospective performance gains in our post-Moore world, process shrinks and clock speeds offer scant pickings, yet there is relative abundance in the orchard of parallelization. Perhaps you think that out of the millions of flagship graphics cards filling high-tech warehouses across the globe, relentlessly slaving away at the coalface of the blockchain and chewing through terabytes of data in the quest to better translate Catalan into Haitian Creole, every single one is also running an instance of Quake 3 to fulfill a purpose that is in some way graphical. Or perhaps you do not think that - since you'd need to be clinically insane to do so - in which case, maybe you can see that the performance gains of massively parallelized computation have exactly zero relevance to whether you prefer textual or graphical interfaces.