Thirty-plus years ago was a time of big changes. There were a bunch of companies competing for the "supermicro" and workstation business, in both hardware and software. If you wrote that kind of software, you might have done it in C. If you were on the business side, you were working with mainframes, or "Basic 4"-type business computers, or maybe CP/M or MP/M machines. IBM PCs were mostly still running DOS, and programming on those was often in dBase, FoxPro, Turbo C, BASIC... etc.
In 1988 I was a 25-year-old working for a 10ish-person, KP-funded start-up that wrote a mechanical CAD package that ran on Microsoft Windows 3.0. The premise was that PCs would take over, that the mini and micro segments would disappear, and that VARs would no longer be necessary to sell hardware/software and train people to use apps.
The application was written in C (not C++) for Windows. It took significant parts of an hour to compile (see the XKCD comic on sword fighting during compiles). Some of the demos we'd do would be on the COMPAQ luggable machines. We'd find bugs in the Windows API. We wrote our own object-oriented DB that lived in memory and on disk. The "Algorithms" book was 18 years away. Most of the team had been through 6.001 (THE 6.001) and had that as a basis. We had to solve pretty much everything ourselves -- no real libraries to drop in. Our initial network had a single 68000-based Sun machine with SCSI hard drives (10MB, then 100MB, as I recall) running NFS, with PC-NFS on all of the PCs, connected via coax Ethernet. We used CVS as our source control. We later got a SPARCstation to do ports to Unix, and it was very much a thing to port separately to Sun, Intergraph, and SGI workstations, since the OSs were different enough.
The first version took about 2 years (hazy...).
And after you'd written the product for Windows, to get it to RUN well we would write programs to do runtime analysis of typical app usage (watching code swap in and out of memory), then build custom linker scripts that packed code to minimize program paging, since PCs didn't have much memory in those days. I'd find out a couple of years later that this is how MSFT did it for their applications; they didn't tell us, we had to figure it out ourselves. Developers were Developers. Testers were Testers. Testing was done primarily by running through scenarios and scripts. We were date-driven, the dates set primarily by industry events, our VC funding, and the business plan.
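The packing trick described above can be sketched roughly like this -- a minimal, hypothetical reconstruction, not the actual tooling. The real programs watched code segments swap in and out; here a simple call trace stands in for that profile, and every function name and size is made up for illustration. The idea is the same: put hot code first and pack it tightly so the working set spans as few pages as possible.

```python
# Sketch: derive a link order from a usage profile so frequently-run
# functions land in the same (early) pages, minimizing paging.
# Trace format, function names, sizes, and page size are all illustrative.
from collections import Counter

PAGE_SIZE = 4096  # bytes per page (illustrative; real segment sizes varied)

def link_order(trace, sizes):
    """Order functions hottest-first, then pack them into pages greedily.

    trace: sequence of function names observed while exercising the app
    sizes: mapping of function name -> code size in bytes
    Returns (ordered function list, list of per-page function groups).
    """
    hotness = Counter(trace)
    # Hot functions first; cold (untraced) functions fall to the end.
    ordered = sorted(sizes, key=lambda fn: -hotness[fn])
    pages, current, used = [], [], 0
    for fn in ordered:
        if used + sizes[fn] > PAGE_SIZE and current:
            pages.append(current)          # page full: start a new one
            current, used = [], 0
        current.append(fn)
        used += sizes[fn]
    if current:
        pages.append(current)
    return ordered, pages

# A (made-up) trace from running through typical drawing scenarios:
trace = ["redraw", "pick", "redraw", "snap", "redraw", "pick", "save"]
sizes = {"redraw": 3000, "pick": 1500, "snap": 800,
         "save": 2000, "about_box": 500}
order, pages = link_order(trace, sizes)
# 'order' would feed a linker response file / ordering script,
# so the hot path stays resident and the About box can page out.
```

The greedy packing here is a stand-in for whatever the real scripts did; the essential design choice is that ordering comes from observed runtime behavior, not from source-file layout.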
As we got ready for releases, I recall sleeping under my desk and getting woken up when bugs were found related to "my area." That company was pretty much everyone's life -- we mostly worked, ate, exercised, and hung out together, and we were always thinking and talking about "the product." There was this thing called COMDEX that would take over Las Vegas each November, as the SECOND biggest show for that town. The first was still the Rodeo :-). If you were in PC hardware or software, you HAD to be there. Since some of the team were core members of the MIT blackjack team, when we went to COMDEX there was some crossing of the streams.
Design principles? Talk it over with the team. Try some things. I can't recall compensation levels at all.
That company got purchased by a larger, traditional mainframe/mini CAD/CAM vendor, about the time that I was recruited to the PNW.
Things better, or worse than today? That REALLY depended on your situation. As a single young person, it was great experience working at that start-up. It was a springboard to working at a mid-size software company that became a really large software company.
Today, it CAN be more of a meritocracy, since there are ways to signal competence and enthusiasm by working on open source projects and communicating with other developers. It's easier to network now. It's HARDER in that there are more developers in nearly every area now than ever, and geography just isn't as important. But I also perceive that most people are less willing to make trade-offs like spending extra time today finishing something while it's still top of mind, vs. "knocking off" and doing it tomorrow. That could just be my perception, however.
I still like working hard.