For those veteran Linux people, what was it like back in the 90s? I saw and heard of Unix systems being available for use, but I didn't see much in use apart from old versions of Debian.

Were they prominent in education, like universities? Was it mainly a hobbyist thing at the time, compared to the business needs served by Windows 98, 95 and classic Mac?

I ask this because I found out that some PC games I owned were apparently also available on Linux, even on CD, from a firm named Loki.

  • nucleative@lemmy.world · 10 months ago

    Driver support was so dicey. If you had anything even remotely non-mainstream, you would be compiling your own video driver, or network driver, or basically left to figure it out for any other peripheral. So many devices, like scanners and very early webcams, simply had zero Linux support at all, but you could at times find someone else’s project that might work.

    I tried to switch to Linux as a desktop system several times in the late 90s but kept going back to Windows because hardware support just wasn’t there yet.

  • SeikoAlpinist@slrpnk.net · 10 months ago

    It was kind of an upstart thing and people were trying to find ways to monetize it.

    My first Linux was Red Hat on a 486 in 1998, and it was different from what I was used to. I was a kid who didn’t know how to startx, so I just emailed a developer using pine and they helped me figure it out and choose a window manager. Nobody even got mad at this barely-teenager just emailing dumb questions. I got lost with fvwm95 and afterstep. I tried every window manager (mlvwm, qvwm, IceWM, etc.) but ended up liking blackbox the most. I had 12MB of RAM on my first Linux system, 1MB of VRAM and 256 colors. We were all sarcastic in a cringe, adolescent way, but everyone was friendly and helpful.

    There was this fascination with monkeys in pop culture, though not real monkeys: chimps and gorillas. People would throw monkey in their username or into some random nu-metal song for some reason. There were monkeys you could download for your desktop. There was this thing by PC Gamer called Coconut Monkey. I don’t know what that’s all about. And anyway, I associate this period with the foot logo of Gnome, which was unprofessional, but that was the point. Also, gimp was a funny name for an app, and pan stood for pimp ass news.

    I discovered Slashdot and Freshmeat and Sourceforge and kuro5hin. Usenet groups were great back then. So was irc. I trolled Slashdot and got negative karma and for the next 15 years before we all moved to SoylentNews, my comments started at -1.

    Nobody knew how to pronounce Linux. Some people said Line-X because his name was Linus like on Charlie Brown, and some people said Leenucks.

    At some point it became a corporate thing and the term Linux was everywhere, randomly on magazine covers. There was also this divide, almost marketing driven; it seemed that people who liked warez and whatever started to love Microsoft and shit on Linux. So gamers especially started to shit-talk, and that’s the first time that being a computer nerd wasn’t this unifying concept; there was an us-versus-them divide. People who could compile code they wrote and who were genuinely curious versus people who just wanted to download a bunch of shit, show you how big their start menu was, and play games. I think this divide still exists.

    There was a bunch of commercial software for Linux too. Metro-X, Accelerated X, Motif, Applixware, Star Office. Descent 3. One of the Quakes. Motif, the toolkit, looked amazing. I thought CDE with themes was the coolest-looking thing ever. But I couldn’t afford CDE, so I used XFce, which was a CDE knockoff built on XForms. And then Enlightenment came along and pushed the boundaries of what we thought a desktop could be. Also, I was able to drag console windows with transparency on that 486 on e16.

    Debian kind of had an elitist community and talked down to people so I never used it. I liked Slackware the most and spent a weekend downloading the floppies over a dialup connection. That led to me discovering FreeBSD in 1999, which I stuck with for almost a decade.

    Later, as a comp sci student, I didn’t see Linux in the university labs. It was Solaris and macOS in the mid 2000s. Eventually, the Solaris computers were shut down and replaced with more Macs.

    My girlfriend’s Windows ME computer was so full of spyware that I installed SuSE with KDE on it for her in her dorm. And she was able to do her papers in AbiWord. And 20+ years later we are married, so it all worked out.

    I finally switched to Debian stable about 4 years ago and have no complaints. It’s a lot easier now.

    • InfiniteKrebs@lemmy.ml · 10 months ago

      Wow, thanks for sharing all that; it was well written and really gave a peek into what it was like!

  • LeFantome@programming.dev · 10 months ago

    Well, XFree86 ( before Xorg and before KMS ) was an adventure. I spent hours guessing the vertical and horizontal frequencies of my monitor trying to get decent resolutions.
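
    To put numbers on why the guessing mattered: a mode is only safe if the horizontal scan rate and dot clock it implies fall inside what the monitor can handle. A rough back-of-the-envelope sketch in Python (the blanking fractions and the monitor limits below are made-up illustrative values, not real modeline math):

        # Rough feasibility check for a CRT video mode.
        # The blanking fractions are ballpark assumptions for illustration,
        # not the actual XFree86 modeline calculation.

        def mode_demands(width, height, refresh_hz,
                         h_blank_frac=0.25, v_blank_frac=0.05):
            h_total = width * (1 + h_blank_frac)         # visible pixels + horizontal blanking
            v_total = height * (1 + v_blank_frac)        # visible lines + vertical blanking
            h_freq_khz = refresh_hz * v_total / 1000     # horizontal scan rate the monitor must sustain
            dot_clock_mhz = h_freq_khz * h_total / 1000  # pixel clock the card must generate
            return h_freq_khz, dot_clock_mhz

        # Hypothetical limits as printed in a 90s monitor manual: 30-70 kHz horizontal sync.
        h_sync_min, h_sync_max = 30.0, 70.0

        for width, height, hz in [(1024, 768, 75), (1280, 1024, 85)]:
            h_khz, clock_mhz = mode_demands(width, height, hz)
            ok = h_sync_min <= h_khz <= h_sync_max
            print(f"{width}x{height}@{hz}Hz -> ~{h_khz:.1f} kHz, "
                  f"~{clock_mhz:.1f} MHz dot clock, {'OK' if ok else 'OUT OF RANGE'}")

    Out-of-range modes are exactly what the older fixed-frequency monitors mentioned later in the thread could not protect themselves against.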

    Other than that, Linux was way more work but “felt” powerful relative to OS options of the time. Windows was still crashy. The five of us that used OS/2 hated that it still had a lot of 16 bit under the hood. Linux was pure 32 bit.

    Later in the 90’s, you could run a handful of Windows apps on Linux and they seemed to run better on Linux. For example, file system operations were dramatically faster.

    And Linux was improving incredibly rapidly so it felt inevitable that it would outpace everything else.

    The reality though was that it was super limited and a pain in the ass. “Normal” people would never have put up with it. It did not run anything you wanted it to, if you had apps you liked on Mac, Windows, OS/2, Amiga, NeXTstep, BeOS, or whatever else you were using ( there were lots of potential options at the time ). But even for the pure UNIX and POSIX stuff, it was hard.

    Obviously installation was technical and complex. And everything was a hodge-podge of independently developed software. “Usability” was not a thing. Ubuntu was not released until 2004.

    Linux back then was a lot of hitting FTP sites to download software that you would build yourself from source. Stuff could be anywhere on the Internet and your connection was probably slow. And it was dependency hell so you would be building a lot of software just to be able to build the software you want. And there was a decent chance that applications would disagree about what dependencies they needed ( like versions ). Or the config files would be expected in a different location. Or the build system could not find the required libraries because they were not where the Makefile was looking for them.

    Linux in the 90’s had no package management. This is maybe the biggest difference between Linux then and Linux now. When package management finally arrived, it came in two stages. First, came packages but you were still grabbing them individually from FTP. Second came the package manager which handled dependencies and retrieval.

    The most popular Linux in the mid to late 90’s was Red Hat. This was before RHEL and before Fedora. There was just “Red Hat Linux”. Red Hat featured RPMs ( packages ) but you were still installing them and any required dependencies yourself at the command line. YUM ( precursor to DNF ) was not added until Fedora Core 1 was released in 2003!

    APT ( apt-get ) was not added to Debian until 1998.
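
    What that second stage actually automated was walking the dependency graph and installing things in an order that satisfies it, which is what we had been doing by hand. A toy sketch of the idea in Python (package names and dependencies are invented for illustration; real package managers also handle versions, conflicts and downloading):

        # Toy dependency resolver: given a package, list everything to install,
        # dependencies first. Package names and dependencies are invented for
        # illustration; this is not any real distro's metadata.

        DEPS = {
            "mail-client": ["gui-toolkit", "ssl-lib"],
            "gui-toolkit": ["x-libs", "png-lib"],
            "ssl-lib": [],
            "x-libs": [],
            "png-lib": ["zlib"],
            "zlib": [],
        }

        def install_order(pkg, resolved=None, in_progress=None):
            """Depth-first walk: dependencies first, the requested package last."""
            resolved = [] if resolved is None else resolved
            in_progress = set() if in_progress is None else in_progress
            if pkg in in_progress:
                raise RuntimeError(f"circular dependency involving {pkg}")
            in_progress.add(pkg)
            for dep in DEPS.get(pkg, []):
                if dep not in resolved:
                    install_order(dep, resolved, in_progress)
            in_progress.discard(pkg)
            if pkg not in resolved:
                resolved.append(pkg)
            return resolved

        print(install_order("mail-client"))
        # ['x-libs', 'zlib', 'png-lib', 'gui-toolkit', 'ssl-lib', 'mail-client']

    In the first stage you were effectively computing that list yourself with a pen and an FTP client; apt and yum made it the computer’s problem.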

    And all of this meant that every Linux system ( not distro, but individual computer ) was a unique snowflake. No two were alike. So bundling binary software to work on “Linux” was a real horror-show. People like Loki gave it a good run but I cannot imagine the pain they went through. To make matters worse, the Linux “community” was almost entirely people who had self-selected to give up pre-packaged software and to trade sweat-equity for paying for stuff. Getting large numbers of people to give you money for software was hard. I mean, as far as we have come, that is still harder on Linux than on Windows or macOS.

    You can download early Debian or Red Hat distros today if you want to experience it for yourself. That said, even the world of hardware has changed. You will probably not be wrestling IRQs to get sound or networking running on modern hardware or in a VM. Your BIOS will probably not be buggy. You will have VESA at least and not be stuck on VGA. But all of that was just “computing” in the 90’s and the Windows crowd had the same problems.

    One 90s hardware quirk was “Windows” printers or modems though, where the firmware was half implemented in Windows drivers. This was because the hardware was too limited or too dumb to work on its own, and to save money your computer would do some of the work. Good luck having Linux support for those though.

    Even without trying old distros, just try to go a few days on your current Linux distro without using apt, dnf, pacman, zypper, the GUI app store, or what have you. Imagine never being able to use those tools again. What would that be like?

    Finally, on my much, much slower 90’s PC, I compiled my own kernel all the time. Honestly multiple times per month I would guess. Compiling new kernels was a significant fraction of where my computing resources went at the time. I cannot remember the last time I compiled a kernel.

    It was a different world.

    When Linus from LTT tried Linux not that long ago ( he wanted to game ), he commented that he felt like he was playing “with” his computer instead of playing “on” his computer. That comment still describes Linux to some extent but it really, really captures Linux in the 90’s.

  • Blaster M@lemmy.world · 10 months ago

    Ah, yes, Linux around the turn of the century. Let’s see…

    GPU acceleration? In your dreams. Only some cards had drivers, and there were more than 2 GPU manufacturers back then, too… We had ATi, nVidia, 3dfx, Cirrus, Matrox, Via, Intel… and almost everyone held their driver source code close to their chest.

    Modems? Not if they were “winmodems”, which had no hardware controller; the CPU and the Windows driver (which was always super proprietary) did all the hard work.

    Sound? AC’97 software audio was out of the question. See above. You had to find a Sound Blaster card if you wanted to get audio to work right.

    So, you know how modern Linux has software packages? Well, back then we had Slackware, and it compiled everything Gentoo-style. In addition, everyone had a hardon for “compiling from source is better”… so your single-core Pentium II had to take its time compiling on a UDMA66-connected hard drive, constrained to 32 or 64 MB of RAM. Updating was an overnight procedure.

    RedHat and Debian were godsends for people who didn’t want to waste their time compiling… which unfortunately was still common even so, because a lot of software was source only.

    Oh, and then MP3 support was ripped out of RedHat in Version 9 iirc, the last version before they split it into RHEL and Fedora. RIP music.

    As for Linux on a Mac, there was Yellowdog, which supported the PPC iMacs and such. It was decently good, but I had to write my own X11 monitor settings file (which I still have on a server somewhere, shockingly; I should throw it on GitHub or somewhere) to get the screen to line up and work right.

    Basically, be glad Linux has gone from “spend a considerable amount of time and have programming / under-the-hood Linux knowledge to get it working” to the “insert stick, install OS, start using it” we have now.

    • MonkderDritte@feddit.de · 10 months ago

      In addition, everyone had a hardon for “compiling from source is better”

      I mean, optimization had more of an impact on the weak CPUs back then, no?

      • Blaster M@lemmy.world · 10 months ago

        That only matters if there’s anything to optimize by source compilation. If the program doesn’t have optimization features in the source, it’s wasted time and energy.

  • atzanteol@sh.itjust.works · 10 months ago

    Everything was harder back then - even when using Windows. But you had to be a real masochist to run Linux.

    Computers were still new enough that most people had no real use for them beyond “work things”. Only nerds really used them for anything else. “Do you have an email address” isn’t a question you ask today.

    “Kids these days” don’t realize how easy they have it when it comes to just general compatibility. There weren’t a lot of standards yet, and vendors had proprietary drivers and offered no support AT ALL for “lye nux”. You had to do a ton of research and fiddling to figure out if there was any support for your specific version of a specific chip used by any peripheral you owned. And then you’d discover that you had to patch your kernel to add a driver that somebody had bodged together. So now you were running your own fun custom kernel just so you could get full-duplex rather than simplex audio! But it worked!

    Like, let’s say today you want to buy an external IDE drive controller to plug some old drives into for backup. You go to Amazon, search “USB external IDE enclosure” and buy the cheapest one you find. It probably works unless it’s defective. In '95, USB and Firewire were in their infancy, so you would probably buy a serial or parallel port device. You would need to find out whether Linux supported the specific version of the thing you wanted to buy, what tools there would be for it, etc. There was no standard “bulk storage device” driver that you could rely on or hope the vendor would implement. Even if you were an early adopter and got a USB or Firewire device it might have some “basic” functionality that works with OSS drivers but you couldn’t use all of it.

    Vendors back then also shipped their own software with things, not just drivers. It was always just the absolute worst crap, buggy as shit. But it would do a lot of the heavy lifting in working with their device. Like any Creative Labs audio player you wanted to get working: sure, it used USB, but it didn’t just mount as a storage device; you needed to use the worst GUI ever put before mankind to use it (under Windows). Under Linux you had to find a specific tool that would support pushing/pulling media from it. These days it would just mount as a drive automatically and you’d use standard desktop tools to interact with it.

    Even with DOS/Windows, you’d buy a game and, as you came home from the store with it in a box, wonder “will this work on my computer and how long will I need to mess with it?” I had to configure a specific CD-ROM driver to be used by DOS just to run X-Wing vs. TIE Fighter, for example. Had a special boot floppy just for that game since that driver didn’t work with literally anything else I had.

    Hardware just generally didn’t “auto configure”. “Plug 'n Play” was still very much in its infancy and you often had to manually configure hardware and install special drivers just for a particular card or peripheral.

    IRQ 7 DMA 220. I probably just triggered some folks. If you were setting up a “Sound Blaster or compatible” then you had to know what interrupt it used (7) and what address it was on on the direct-memory bus (220). And you hoped there wasn’t a conflict with something else. If there was then there would be a DIP switch you could use to change the base memory address or IRQ from the default. But you were telling your software where to find the card.

    USB was a f’ing game changer for peripherals. Serial and parallel ports were so slow and obnoxious to use. Before USB there was no real way to “probe” the bus to discover what was there unless you knew exactly what you were looking for (there’s no lsusb for serial ports). So you just guessed at the driver you needed and “modprobe foo”, hoping it worked.

    It’s amazing what 20ish years of just developing standards has done.

    If you want a taste of that world I highly recommend LGR on YouTube. He’s mostly Windows focused, but look for videos where he tries out “oddware” to see how often he has trouble getting things to work on period hardware, even using the vendor-supplied software. Then multiply that by 100x for Linux. :-)

    • DAMunzy@lemmy.dbzer0.com · 10 months ago

      I was reading your wall of text chomping at the bit to complain about IRQs and dip switches but you covered even that!

      Oh wait, you didn’t include having a math coprocessor daughterboard! I barely remember them but remember my dad building computers with them.

      I kinda wish I was a teen when the first computer kits were coming out. And phone phreaking.

    • MonkderDritte@feddit.de · 10 months ago

      Even if you were an early adopter and got a USB or Firewire device it might have some “basic” functionality that works with OSS drivers but you couldn’t use all of it.

      Oh, like scanners still!

      • atzanteol@sh.itjust.works · 10 months ago

        Scanners and printers are the one area of computing that has always sucked the most relative to other things. They’re better these days, but they’re still the one thing I expect to fail on a regular basis.

    • Telorand@reddthat.com · 10 months ago

      I wasn’t that into computers at that point in my life, but it was definitely a time when “computers” was a hobby, in the same way that restoring old motorcycles was/is a hobby. Sure, you might take it out for a spin every now and again, but a lot more time is spent tinkering than simply using.

      I’m constantly amazed by how much better the end-user experience is today, even just from 10 years ago. The installers are better, the pre-configured software and settings are more thoughtfully chosen, and now we’re at the beginnings of meaningful Linux gaming for non-hobbyists.

      We truly stand upon the shoulders of giants, and I look forward to the future.

      • atzanteol@sh.itjust.works · 10 months ago

        Gaming has been the biggest change in the last 10 years or so, mostly thanks to Steam. It’s easier to game on Linux these days than it is on MacOS! It’s crazy.

    • aksdb@lemmy.world · 10 months ago

      I remember buying a bunch of old HP ISA 100Mbit NICs to turn an old computer into a router/server combo. Naive as I was, I put them all in and nothing worked. Turns out they were all configured to use the same IRQ (since they likely came from independent machines), and that caused them to overwrite each other’s settings… including the MAC address. Thankfully I found some nice hacker who had worked with these cards before and published a little C tool to rewrite their EEPROMs. I contacted him to ask if he saw a chance to resurrect the cards, and that saint indeed hacked the necessary features into his tool so I could rewrite the MAC addresses, change the IRQs one by one, and ended up with a working network. Good times.

    • Cyborganism@lemmy.ca · 10 months ago

      LoL!!! IRQ 5 DMA 220 for me. Had to manually adjust the jumper on the sound card.

      Fucking hell…

      • Jesus_666@lemmy.world · 10 months ago

        Port 220.

        IRQ 5, port 220h, DMA 1 was what I used for my SoundBlaster 2.

        Later I used IRQ 5, port 220h, DMA 1, high DMA 5 for my SoundBlaster 16.

        • Cyborganism@lemmy.ca · 10 months ago

          Do you think it’s worth getting a Sound Blaster card today? I’ve read you can get better sound effects in game. Can’t the on board audio chips do that now?

          • Jesus_666@lemmy.world · 10 months ago

            I gotta be honest, I haven’t used a dedicated sound card since the Vista/7 era when EAX stopped being a thing and onboard sound could handle 5.1 output just fine. The last one I had was a SoundBlaster Audigy.

            These days the main use for dedicated sound interfaces is when you need something like XLR in/out, and then you’ll probably go with something USB.

    • arran 🇦🇺@aussie.zone · 10 months ago

      This. However, from about the release of Knoppix and Ubuntu, things started looking and feeling a lot more like they do today. I credit that to Knoppix for the X and filesystem work and Ubuntu for setup and everything desktop.

      So even though the late 90s were tough, it was nothing like the mid 90s. By around 2004-2005ish the install and setup were substantially easier; however, the reputational damage still exists today.

      I remember spending a lot of time in XFree86 config files, reconfiguring them trying to figure out what worked best on my monitor, and then the migration to XOrg. All good times.

      There was however a substantial amount of hype around Linux. It wasn’t quite what it is with AI. But you couldn’t read a magazine without encountering it in some way; it was the type of hype where everyone knew of it but few people had anything to do with it.

      Another thing that hasn’t been mentioned is that there was a new distribution cropping up every day or so (it felt like it, at least). This seems to back up that statement: https://en.m.wikipedia.org/wiki/File:Linux_Distribution_Timeline.svg

  • Scipitie@lemmy.dbzer0.com · 10 months ago

    You have several long and comprehensive answers so please allow me to add an emotional one:

    Fucking compile error in hour six of what you estimated to be a four hour compile job because of a mistake you made that you found within 5 seconds after the error!!

    Fucking why doesn’t this compilation start I can’t find my mistake for hours?!

    Where does this module come from?! What do you mean “root kit”? Learning was fun!

    It all was fun! :)

    • mrvictory1@lemmy.world · 10 months ago

      I recently tried to compile Crossover’s Wine from source. I went through many compile errors, fixed dependency issues, started over, etc. In the end I think I compiled vanilla Wine instead of Crossover (there are different source tarballs) because O365 still refused to install lol.

  • Blizzard@lemmy.zip · 10 months ago

    Not a veteran, but… During the 90s, while still in primary school, a friend of mine bought a Chip magazine with a CD attached and instructions inside the magazine on how to install a mysterious thing called “Linux” from said CD. It was supposed to be something like Windows 95, but new, better, and it had a penguin on it, so we decided to try it.

    We followed the magazine’s installation guide to the letter (or at least we thought so) until the installation got stuck at an error saying KERNEL PANIC!!! and wouldn’t let us finish. We didn’t understand English much back then, but we found the panicking kernel hilarious. Anyway, we figured it had been enough h4Ck!nG for that day and got back to playing Diablo 1.

  • mortalic@lemmy.world · 10 months ago

    This was me, you’re talking about me. 😂 In the 90’s Linux was barely getting started, but Slackware was probably the main distro everyone was focused on. That was the first one I ran across. This was probably the late 90’s; I don’t remember when Slack first came about though.

    By the time the 2000’s came around, it was basically a normal thing for people in college to have used or at least tried. Linux was in the vernacular, textbooks had references to it, and the famous lawsuit from SCO v IBM was in full swing. There were distro choices for days, including Gentoo, which I spent literally a week getting everything compiled for on an old Pentium, only for it to not support some of the hardware and refuse to boot.

    There was a company I believe called VA Linux that declared that year to be the year of the Linux desktop. My memory might be faulty on this one.

    Loki gaming was a company that specialized in porting games to Linux, and they did a good job at it but couldn’t make money. I remember being super excited about them and did buy a few games. I was broke too, so that was a real splurge for me. I feel like they launched in the late 90’s and crashed in the early 2000’s.

    • constantokra@lemmy.one · 10 months ago

      I think you need to qualify that having used or tried Linux in college was normal in the 2000s for someone in computer science or engineering, or basically my fellow undiagnosed autistics and autistic adjacents. In my experience it was fairly normal in college for most people to have trouble operating a basic word processor, and they would not have had any idea what Linux was at all.

      • mortalic@lemmy.world · 10 months ago

        Maybe, but I took some business courses too and even some of those students had at least tried a Linux distro. I think it was more widespread than just turbo nerds and CS majors. Hell, one of the biggest Linux guys I knew was an anthropology major.

  • kbal@fedia.io · 10 months ago

    My linux experience:

    1993 - Hey, there’s a new Unix-like thing for the PC. You can check it out down at the university computer club.

    1994 - Wow, I finally managed to get X running

    1996 - It was somewhat normal for more nerdy software developers to run linux full-time on their desktop at work.

    1998 - Linux was taking over servers to the point where you rarely saw Solaris, HP-UX, AIX around any more.

    2002 - Everyone agreed that linux was pretty much ready to take over the desktop as well.

  • RedWeasel@lemmy.world · 10 months ago

    My community college (1997) had a SuSE Linux computer lab that I learned on. It was mostly used as a networking/server and programming platform.

    Loki was the leading porting developer at the time.

  • Thorry84@feddit.nl · 10 months ago

    I started with Suse 5 when it came out, as something I was interested in fucking about with. I didn’t have internet access at that time, but I did have a couple of books about it (the distro came with a book as well). It was a couple of CDs and a boot floppy disk (booting from CD wasn’t really a thing).

    I used it for years for software development and simple tasks like word processing. Getting my printer working on the thing was a chore, as was basically anything. Especially without internet, solving issues was sometimes simply impossible. My scanner simply didn’t work. Getting the desktop environment to run was very hard; I struggled with it for a long time. And once I got it working properly, I got a new video card and it broke the whole thing again.

    The system was very painful to use; it was super cool, but almost nothing ever worked right. And trying to fix shit usually made it worse. But once you did get it working right, it was simply awesome, and the feeling of accomplishment after finally getting something right was great. For software development on the terminal it was pretty awesome though. Back then I did almost everything in text mode, as I was used to DOS before that. Going into Windows was something you did only sometimes with Windows 3.11 (and even 95), and I did the same in my Linux environment. The desktop environment used up a lot of memory and was pretty slow, so I preferred the console. It was only later that booting into the desktop became the norm (around the Windows 98 era).

    I used Suse till version 6.1 (still have that box). I bought version 7 (still have that box as well), but never really used it.

    Back then I used Debian to create small internet routers for my friends. I got an old compact computer, put in a floppy with Debian and a couple of network cards, and created small NAT boxes like that. This was before NAT routers were the norm; people just had internet on one machine, connected directly. But as computers became cheaper, a lot of folks had more than one computer in the home, with no real way to share the internet connection between the different computers. Microsoft created the Internet Connection Sharing feature, but that was pretty slow, disconnected often, and ate resources on your “main” PC. So my little boxes worked great: I helped people set up a home network, connected my magic box to get every system online, and also helped them set up some port forwarding for the stuff they used.

    Because I used Debian a lot, I switched over to Debian for my main rig when Suse 7 released. Used Potato, Woody, Sarge and Etch a lot. Switched around between Debian and Ubuntu in the Lenny and Squeeze era. Have been using Ubuntu ever since, never really had a reason to switch. Debian compared to Suse was so nice, I really liked the way Debian did things. It made a lot more sense for me in my head compared to Suse.

    As I fucked around with computers a lot, I always had both Linux and DOS/Windows machines running and even had a couple of dual-boot systems. For any kind of gaming, DOS/Windows was required back then and I did love to game. I do think Windows 10 will be my last Microsoft OS, since Windows 11 absolutely sucks (I use it at work, I hate it). Getting work stuff done on Linux just as well as on Windows has become less and less of an issue. And gaming has come leaps and bounds due to the work on the Steamdeck.

    So I hope to fully ditch Microsoft in the near future, even though my first ever computer in 1984 ran Microsoft firmware with Microsoft Basic as the default user interface.

    • constantokra@lemmy.one · 10 months ago

      Did you ever dual boot Linux and Windows, and also have VMware installed in both so you could boot the other one from inside whichever you had booted? Because I spent an insane amount of time screwing around with that, as excruciatingly slow as it was back then.

  • The Zen Cow Says Mu@infosec.pub · 10 months ago

    Way back in the early 90s I needed to use LaTeX for university. The DOS version was awful and couldn’t handle large documents. So the options were (1) a NeXTcube for $$$$, (2) NeXTstep 3.3 for PCs for $$$ (some faculty had this), or (3) linux. So I downloaded slackware on dozens of disks.

    You had to configure the kernel, which wasn’t too hard since the autoconfig walked you through it. The hardest part was setting up X11, which required a lot of manual config, and if you screwed up the timings you could destroy a CRT monitor. OpenStep was an option, so there was a moderately friendly window manager available.

    Learning Emacs was also fairly unpleasant, but that was the best option for editing TeX at the time.

    Everything would work, until it suddenly would break. But nonetheless I was somehow able to get that thesis done.

    Ugh, modern linux is SOOOOOOOOOOOOOOOOOO much better

    • JaxNakamura@programming.dev · 10 months ago

      So I downloaded slackware on dozens of disks.

      This is no joke. When I downloaded Slackware in '95 or '96, it was over 100 3.5" floppies of 1.44 MB each. And there were still more available, those were just the ones I thought I’d need. And before you could even begin installing, each of those had to be downloaded, written and verified because floppies were not terribly reliable.

    • Aceticon@lemmy.world · 10 months ago

      Just to add to this: early on there was no such thing as kernel modules, so you had to compile your own kernel with the hardware support you needed for anything beyond the basics (if I remember correctly, that was only basic processor stuff, keyboard and text-mode VGA).

  • gnuplusmatt@reddthat.com · 10 months ago

    I can’t remember if it was 99 or 2000, but I got a copy of Red Hat 6.0 (Hedwig) on the cover of a magazine and installed it. I remember the LILO boot manager giving me trouble, and then it was multiple days of dialing up the internet on my dad’s PC to find info on getting X11 to run correctly on my graphics hardware. Once I got that going, it was my winmodem that defeated me in the end; I couldn’t get any internet. So I was back to Windows for another couple of years.

    In 2003 my university course had a Linux Administration subject and the lecturer had built a live CD of Fedora Core 2 (this was in the days before live CDs were a regular thing). It was a revelation and it worked with much less setup. We had a Linux lab, but the live CD allowed us to work on Linux on our personal machines. I’d dabble with Linux and explore distros for a few years; depending on hardware compatibility, I’d always have at least one Linux box. I remember attempting to get Half-Life 2 running in Cedega (a commercial fork of Wine), and even played the original Left 4 Dead with friends; this was in 2008. I was there when PulseAudio launched before it was ready and when KDE moved to version 4 and was an absolute resource hog. I bought the Unreal and Unreal Tournament games on disc to play on Linux. Was disappointed when the UT3 release got delayed and then eventually canceled. I remember going to the id Software FTPs to get the Linux binaries for all the Quakes. There were a few other Linux adventures in there, like a misguided attempt at compiling Gentoo in 2007 and working out a MythTV server as a media PC and PVR.

    Was excited when I got beta access to steam in 2012, and I haven’t had Windows on my personal computers since then.

  • HarriPotero@lemmy.world · 10 months ago

    Slackware and Red Hat were the two distros in use in the mid 90s.

    My local city used proper UNIX, and my university had IRIX workstations and SunOS servers. We used Linux at my ISP to handle modem pools and web/mail/news servers. In the early 2000s we had Linux labs, and Linux clusters to work on.

    Linux on the desktop was a bit painful. There were no modules. Kernels had to fit into main memory. So you’d roll your own kernel with just the drivers you needed. XFree86 was tricky to configure with timings for your CRT monitors. If done wrong, you could break your monitor.

    I used FVWM2 and Enlightenment for many years. I miss Enlightenment.

    • mrvictory1@lemmy.world · 10 months ago

      I used Enlightenment on Arch Linux for a year, in 2020-21. The PC had 4 GB of RAM and an HDD; Enlightenment was blazing fast. I could type enlightenment_start in a tty and reach a Wayland desktop in under a second, with 250 MB of RAM used in total. E is still alive and kicking.

    • andrewth09@lemmy.world · 10 months ago

      If done wrong, you could break your monitor.

      You mean your graphics drivers, right? Not your actual hardware…

      • Truls@mastodon.social · 10 months ago

        @andrewth09 I bricked a monitor when I tried to fiddle with the graphics settings in Linux back in the late 90s (tried to get it to run at 1280×1024, which was considered “hi resolution” back then). I had to buy a new monitor. Then I installed Windows and only returned to Linux a long time after that.

    • constantokra@lemmy.one · 10 months ago

      How wrong did you have to be to break your monitor? Because I’m positive I got it very wrong a whole lot of times and never managed that.

      • AnUnusualRelic@lemmy.world · 10 months ago

        I managed to make mine do some very worrying noises, but none of my monitors broke either, even though the bandwidth I based my calculations on was often kinda made up.

      • cmnybo@discuss.tchncs.de · 10 months ago

        By the late 90’s most monitors were smart enough to detect when sync speed was too far off and not try to display an image.
        It was the old monitors that only supported a single or fixed set of scan rates that you had to worry about damaging. Some could be very picky and others were more tolerant.

  • vfreire85@lemmy.ml · 10 months ago

    The first contact I had with Linux, back in mid-90s Brazil, was with my ISP’s login terminal, which displayed some arcane text reading “Red Hat Linux version x.x”. After that, during my father’s final years working at the Bank of Brazil, he had to deal with Cobra’s homemade distro on his workstations (Cobra had developed a Unix in the 80s that ran on m68k’s, so no surprises there). It was an absolutely esoteric system to those who only knew the DOS/Windows 3.11 duo, since W95 only arrived in our country in numbers in 96. The thing really caught on during the early to mid 2000s, with faster and cheaper ADSL connections, and with them, abundant knowledge and downloads available to any script kid.

    • lfromanini@feddit.nl · 10 months ago

      I remember using Conectiva Linux in Brazil. I also tried Kurumin Linux, both Brazilian distros. The biggest pain I recall from those years was making a modem work, and I ended up buying an expensive US Robotics one, which worked like a charm.