• trevor@lemmy.blahaj.zone · 2 months ago

    For those who are, for some reason, skeptical of the value of more performant software (???), here’s a simple program to demonstrate the point:

    use std::{
        fs::File,
        io::{BufWriter, Write},
    };
    
    fn main() {
        // Open stdout as a plain file and wrap it in a buffered writer so the
        // program itself isn't bottlenecked on thousands of tiny writes.
        let out = File::create("/dev/stdout").unwrap();
        let mut w = BufWriter::new(out);
    
        // Print 0 through 100000, one number per line.
        for i in 0..=100000 {
            writeln!(&mut w, "{}", i).unwrap();
        }
    }
    

    It simply prints the numbers 0 through 100000 to the screen. Compile it (rustc path-to-file), run it in a non-accelerated terminal with time ./path-to-bin, and then time that same binary in a GPU-accelerated terminal emulator.

    The difference becomes more apparent with more text. Now imagine running something like find over a large set of files and printing every match: on a non-accelerated terminal the command takes measurably longer to finish, because it can’t complete until the terminal has rendered all of that output.
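
    To see that the slowdown is the terminal and not the program, here’s a minimal sketch (my addition, not part of the program above) that generates the exact same text into an in-memory buffer instead of the terminal. It should finish almost instantly, so nearly all of the wall-clock time you measure above is the terminal rendering the output.

    use std::io::Write;
    use std::time::Instant;
    
    fn main() {
        let start = Instant::now();
        // In-memory sink: Vec<u8> implements std::io::Write, so nothing is
        // ever sent to the terminal.
        let mut out: Vec<u8> = Vec::new();
        for i in 0..=100000 {
            writeln!(&mut out, "{}", i).unwrap();
        }
        eprintln!("generated {} bytes in {:?}", out.len(), start.elapsed());
    }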

    It’s fine if you don’t need a GPU-accelerated terminal, but having acceleration is genuinely useful and a noticeable quality-of-life improvement if you do anything more than just basic CLI usage.

    • asdfasdfasdf@lemmy.world · 2 months ago

      Isn’t the terminal only going to affect performance when the output is actually displayed in it? I’d think a program like find / piped into something else would pass the data along under the hood, and all the terminal would have to deal with is the final output of the whole pipeline.
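
      For what it’s worth, a program can check this itself. Here’s a minimal sketch (just my illustration using the standard library’s IsTerminal trait, not anything find actually does) that reports whether its stdout is a real terminal or a pipe/redirect:

      use std::io::{stdout, IsTerminal};
      
      fn main() {
          // IsTerminal (stable since Rust 1.70) asks the OS whether the
          // handle is attached to a terminal.
          if stdout().is_terminal() {
              // Everything written here has to be rendered by the terminal.
              println!("stdout is a terminal");
          } else {
              // Output is piped or redirected; the terminal never sees it.
              println!("stdout is a pipe or file");
          }
      }

      Running the binary directly prints the first message; piping it through something like | cat prints the second, which is exactly the “under the hood” distinction: only text that actually reaches the terminal has to be rendered by it.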