Ask HN: Why did consumer 3D printing take so long to be invented?

80 points, posted 6 days ago
by superconduct123

Item id: 42079086

203 Comments

cityofdelusion

5 days ago

The pieces did NOT exist in the 1970s. Fast microcontrollers, stepper motors, precision miniaturized manufacturing, reliable and cheap miniaturized DC electronics, and far, far more technology were non-existent at any kind of affordable price point. Look at kitchen appliances or metal/wood shop machinery from this era, still heavily analog, mostly made from sheet steel, mostly non-computerized. The 80s would bring better microprocessors, but even the simple Nintendo was an inflation-adjusted $450. For comparison, the first RepRaps used a full-power PC as their host machine, their materials cost roughly $1000 in today's USD, and they needed parts from a commercial Stratasys machine.

Some of the greatest and most underappreciated technological achievements of the last 40 years have been in materials science and miniaturization.

kragen

5 days ago

Microcontrollers and stepper motors were already controlling 2-D printers in the 01970s and early 01980s at higher speeds and similar powers to currently popular 3-D printers. The first RepRaps did not "use a full-power PC as their host machine"; they were controlled by AVRs, just like Arduino (which didn't exist yet). Generating motor control waveforms on a full-power PC is a nightmare nowadays, and was already a nightmare in 02005. The LinuxCNC people would do it by running Linux under a real-time hypervisor. The first CNC machining was done in the 01950s with IBM 704s, comparable in power to an Intel 8008. The 6502 used by the Commodore 64 to drive its floppy drive would have worked fine, though it is much slower than the AVR and doesn't have built-in PWM generation hardware.

I agree that the pieces did not exist in the 01970s, but the missing pieces weren't the computation.

actionfromafar

5 days ago

Now I'm very tempted to build a 3D printer with an 8-bit home computer and vintage parts.

dcminter

5 days ago

That sounds like a delightful project! This contemporary book from Usborne demonstrates the point that basic stepper controls from an 8-bit computer were well within hobbyist reach:

https://drive.google.com/file/d/0Bxv0SsvibDMTZ2tQMmpyOWtsRFk...

I do think that the limitations of memory (and disk) will require some ingenuity in 3D printing more than the most trivial procedurally defined objects!
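
In that spirit, here is a minimal sketch of what such a book-style stepper routine boils down to - C standing in for the period BASIC or machine code, with a hypothetical port address and coil wiring:

    /* A minimal four-phase unipolar stepper drive, a sketch only.
       COIL_PORT is a hypothetical memory-mapped output latch wired
       to the four motor coils through driver transistors. */
    #include <stdint.h>

    #define COIL_PORT (*(volatile uint8_t *)0xC000)   /* hypothetical address */

    static const uint8_t phases[4] = { 1, 2, 4, 8 };  /* energize one coil at a time */

    void step(int dir, unsigned n)                    /* dir: +1 or -1 */
    {
        static unsigned i;
        while (n--) {
            i = (i + (unsigned)(4 + dir)) % 4;        /* advance or retreat one phase */
            COIL_PORT = phases[i];
            for (volatile int d = 0; d < 200; d++) ;  /* crude delay sets step rate */
        }
    }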

Suppafly

5 days ago

That's a great book; it looks so simple, and then the next thing you know it's teaching you about resistors and soldering and programming, all in like 50 pages of mostly graphics.

TuringTourist

5 days ago

Make your own 8-bit computer on breadboard a la Ben Eater for bonus points

Suppafly

5 days ago

My son is into vintage computers and I'm pretty sure I could make a 3D printer from the old C64s, disk drives, and printers in my basement. The hot end would be the only issue, and I'm pretty sure you could rig one up from a glue gun or soldering iron.

stevekemp

5 days ago

Floating point maths would be hard, and fitting the model in 64k would be a challenge - but paging could resolve that, or streaming/paging from disk if you assume something like CP/M.

Logically it's a simple project, though the devil is always in the details!

kragen

5 days ago

Floating-point was ubiquitous in the BASIC interpreters on early home computers. It was just slow. CP/M didn't really do paging, although you could read and write 128-byte records to disk. But CP/M disks were typically only 90 kibibytes, or maybe up to a mebibyte for 8-inch floppies. And a lot of hobbyist home computers omitted disk drives because they were too expensive.

stevekemp

5 days ago

Sure floating point is possible, but if you're thinking of something like a 6502 or Z80 you'd have to implement it yourself - no maths co-processor, or dedicated instructions for it.

In terms of paging I was thinking of paging in later layers of data from disk, perhaps from a large file, perhaps from a series of files. But CP/M certainly did have support for paging RAM, in 3.x, if your hardware supported it.

(My own emulator, and the hardware I have running genuine CP/M are all running 2.x so no paged RAM there. Shame!)

Suppafly

3 days ago

Assuming you're using a pile of old hardware, you likely could dedicate some of it to just doing floating point math.

kragen

5 days ago

Yeah, people did do overlays a lot on CP/M.

If you were using a 6502 or Z80 in the late 70s you wouldn't have to write the floating-point routines; you could just call the ones in the BASIC interpreter you probably had in ROM. They'd still be slow.

As for paging RAM, do you mean bank switching? The 8085 and Z80 didn't have MMUs, so you couldn't do "paging" in the sense people normally understand it today.

brucehoult

2 days ago

The ROM floating point routines in Microsoft BASIC on the Apple ][ (and similar) took on the order of 4 ms for a single precision multiply. BASICA on an IBM PC was only slightly faster.

The FP routines in the Arduino library take about 5 µs for a multiply on an AVR (e.g. Uno).

That's roughly 1000 times faster.

Factors in that include:

- 16x faster clock speed

- typically 1 cycle per instruction vs 3

- 32 8-bit registers can easily hold all the data for operands and result and temporary values, vs Zero-page on the 6502

- single-cycle register to register 8 bit arithmetic vs three instructions needed per 8 bits on 6502

So that's a factor of around 16x3x3 = 144x combined.

All the above apply equally to add/sub and multiply, but also:

- AVR has 2-cycle 8x8 -> 16 multiply instructions, for signed, unsigned, or mixed operands, and for fixed point 1.7 format as well. That's hundreds of clock cycles on 6502 or z80.

Also, the Microsoft BASIC interpreters used a 40 bit floating point format (32 bits of mantissa) instead of 32 bit. That's another factor of 16/9 for multiply.

The 8080/z80 and 8086 have more registers than the 6502, but not enough to implement soft FP keeping values in registers like AVR can.
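
To make the multiply point concrete, here is a small sketch of my own (not from this thread) of 8.8 fixed-point multiplication in C; on AVR the 16-bit product compiles down to a few of those 2-cycle hardware MULs, while a 6502 has to synthesize every partial product from shifts and adds:

    #include <stdint.h>
    #include <stdio.h>

    /* 8.8 fixed point: value = raw / 256. */
    typedef int16_t fix8_8;

    static fix8_8 fix_mul(fix8_8 a, fix8_8 b)
    {
        /* One 16x16 -> 32 product, then drop the 8 fraction bits.
           On AVR this is a handful of hardware MULs; on a 6502 it
           is hundreds of shift-and-add cycles. */
        return (fix8_8)(((int32_t)a * b) >> 8);
    }

    int main(void)
    {
        printf("%d\n", fix_mul(0x0180, 0x0240));  /* 1.5 * 2.25 -> 864, i.e. 3.375 * 256 */
        return 0;
    }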

kragen

2 days ago

Yes, excellent points! But you don't have to do floating-point math to interpret G-code or generate motor control waveforms. You could probably do it on an 8051 fast enough to run a 3-D printer entirely in floating point, but you can also do the math in integers, only going beyond 8-bit precision when necessary.
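
As an illustration of the integer approach (my sketch, not actual RepRap firmware): you can parse a G-code coordinate like X12.35 straight into micrometres, so a 32-bit integer comfortably covers any build volume:

    #include <stdint.h>
    #include <stdio.h>
    #include <ctype.h>

    /* Parse a decimal coordinate ("12.35", "-0.4") into integer
       micrometres with no floating point.  A sketch: real firmware
       would also clamp values and handle the rest of the G-code word. */
    static int32_t parse_um(const char *s)
    {
        int32_t sign = 1, mm = 0;
        if (*s == '-') { sign = -1; s++; }
        while (isdigit((unsigned char)*s))
            mm = mm * 10 + (*s++ - '0');
        int32_t um = mm * 1000;
        if (*s == '.') {
            s++;
            for (int32_t place = 100; isdigit((unsigned char)*s) && place; place /= 10)
                um += (*s++ - '0') * place;
        }
        return sign * um;
    }

    int main(void)
    {
        printf("%ld\n", (long)parse_um("12.35"));  /* prints 12350 */
        return 0;
    }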

stevekemp

5 days ago

Bank switching was the term I should have used, thank-you!

aforwardslash

5 days ago

Just buy an old Ender 3; they came with stock 8-bit CPUs :)

kragen

5 days ago

I'm very interested to hear about the results if you do. If I'm wrong, you'll be able to show me how!

wilg

5 days ago

I'd watch a YouTube of this

inglor_cz

5 days ago

An offtopic question: how do you manage to write all the years consistently with 0 at the start (I know why, but how), when everyone around you uses a different standard.

If I decided, IDK, to write "cnow" instead of "know", or "J" instead of "I", I wouldn't be able to do so consistently. Not in a world that massively uses the other word.

Or it would take me twice as much time to double-check everything that I have typed.

smeej

5 days ago

Wait, I can guess at why (somebody's probably still going to be alive in 8k more years), but is that really the reason? In case somebody's gonna care enough to read this comment on HN that long from now? And not realize that of course we didn't bother adding leading zeros? Why not 0001970s in that case?

(I'm sincerely hoping I'm missing something here and am going to look really silly asking this question.)

kragen

5 days ago

Go wild, man. I fully support your right to talk about the 0001970s.

kojeovo

5 days ago

I'd put my money on it flipping back to 0 after 9999

Izkata

5 days ago

From past evidence, I think it's more likely to start a new calendar at 0 somewhere around 4000-5000.

Suppafly

5 days ago

>An offtopic question: how do you manage to write all the years consistently with 0 at the start (I know why, but how), when everyone around you uses a different standard.

Never underestimate what lengths someone will go through to appear to be unique.

layer8

5 days ago

More importantly, why stop at five digits? That seems to be taking quite a limited outlook, and I can already see the Y100K bugs it will cause.

kqr

5 days ago

You get into the habit fairly quickly. I type organise rather than organize despite everyone around me not doing that, but it is legitimately difficult for me to do it the popular way.

gridspy

2 days ago

Organise is the British English spelling. So you're right somewhere!

kqr

2 days ago

Indeed, I picked it up working with Brits early in my professional English learning journey.

kragen

5 days ago

I usually double-check everything I write anyway. I just have to be careful not to "correct" literal quotes that include dates or accidentally talk about the Intel 08008 and the 06502.

esperent

5 days ago

Why just one leading zero?

tdeck

5 days ago

> The first RepRaps did not "use a full-power PC as their host machine"; they were controlled by AVRs, just like Arduino

It's possible that they were referring to the RepRap Host Software, which was RepRap's original slicer.

https://reprap.org/wiki/DriverSoftware

kragen

5 days ago

Oh, I'm sure you're right! Thank you. I didn't realize Skeinforge wasn't the first.

dcminter

5 days ago

It depends how you define "affordable". Daedalus (the late David E.H. Jones), writing in New Scientist in 1974, sketched an idea for a laser-based system and almost immediately received a notice of complaint from an existing patent holder (plus ça change ...), although unlike patent trolls this patent holder had actual objects made by the process.

The system here was based around a minicomputer (or at least a successor patent of 1978 so described it), so we're talking tens of thousands of dollars for the compute involved in that scheme - but that first 1971 patent must have expired in the 90s, by which time inexpensive compute was trivially available to match early-70s minicomputer capabilities.

Excerpts from the exceptionally excellent book "The Inventions of Daedalus - A Compendium of Plausible Schemes" which is sadly long out of print:

https://paperstack.com/img/photos/page%2090.jpg

https://paperstack.com/img/photos/page%2091.jpg

nonameiguess

5 days ago

This history is fascinating as hell. I looked up the name of the person in that excerpt who claimed the original patent. He's apparently in the Guinness Book of World Records: https://www.guinnessworldrecords.com/world-records/463935-fi...

I don't know why they have a record for earliest filing of a 3d-printing patent, but this guy (Wyn Kelly Swainson) was an American graduate student of English Literature who filed the patent in Denmark after teaching himself some basic lithography and chemistry from a library after wondering why no technique existed to make copies of sculptures. He ended up doing research for DARPA and founding an engineering company.

Also, for any random reader who isn't familiar with the cultural history of the west, Daedalus was the mythical designer of the labyrinth of Crete made to contain the minotaur, and also the father of Icarus, who crafted the artificial wings he and his son used to escape Crete after King Minos tried to keep them trapped there after he helped Ariadne help Theseus to kill the minotaur. He was also somewhat responsible for creating the minotaur in the first place, as he built the fake cow costume King Minos' wife used to mate with a bull and birth the minotaur. I guess it became a popular pen name in the early to mid 20th century, because it was also the name (Stephen Dedalus) of James Joyce's self-insert character in A Portrait of the Artist as a Young Man and Ulysses.

Suppafly

5 days ago

Sure, but they did by the 80s and 90s. I had a CAD/CAM class in the 90s, and the pen plotter we used (not sure which model, but the HP 7090 from the 80s is similar) had all the technology necessary to build a 3D printer. Hell, you could build a 3D printer from essentially first principles using old broken tech and stuff from a hardware store. The first time I saw a 3D printer years ago, my first thought was to wonder why it took so long for them to happen, because the technology was really basic.

kragen

5 days ago

You will probably be interested in reading my summary in https://news.ycombinator.com/item?id=42080682 of the things the RepRap project struggled with in the first years of the project; it explains why it took so long.

Suppafly

5 days ago

>They wasted a lot of time trying to make it work without even a heated bed, to keep costs down.

I'm not trying to downplay the RepRap team, and their work did lead to some of the innovations in home 3D printing, but I suspect a lot of the project was like this: losing time by ignoring things that already existed and trying to reinvent them using cheap household items, going down unnecessary rabbit holes. Thank god people serious about actually making a shippable product got involved at some point, or it'd all still be theoretical.

kragen

5 days ago

That's easy to say with the benefit of hindsight! But before going down the rabbit holes, we didn't know which ones were unnecessary. If you didn't go down any of the rabbit holes, you'd end up with a US$60k machine built around US$5k ballscrews and ABS and whatnot.

The point at which people serious about making a shippable product got involved was already after it was no longer theoretical.

Suppafly

3 days ago

>If you didn't go down any of the rabbit holes, you'd end up with a US$60k machine built around US$5k ballscrews and ABS and whatnot.

Honestly, that's the normal way to do product design: start with the expensive version that works, then slowly redevelop it to use cheaper parts, or make big enough purchases that the cost of the parts naturally comes down.

I get that you were involved in the project, and won't hear of any criticism, but people serious about shipping instead of tinkerers would have done things differently, and there is no reason to deny or get upset about that.

kragen

3 days ago

I wasn't; by "we" I meant "humanity".

Evidently the normal way to do product design failed at producing consumer 3-D printers, as it invariably fails at big innovations, so in those cases people have to be tinkerers to be serious about shipping. Clayton Christensen has a book about this you might be interested in reading.

simne

3 days ago

> by "we" I meant "humanity".

Humanity is not one subject. It is a multitude of objects and subjects. The best analogy is a loosely coupled network (each active node has ~10..100 connections, plus billions of passive objects with 1..3 connections).

Because of this, we can only talk about particular subjects who are directly interested in using opportunities and have everything needed for that activity.

Even if we use a relatively conservative approach and consider only formally declared countries as subjects, we have about 200 subjects (plus some active and successful humans; in business, for example, a good CEO is usually considered 80% of a company's value). If we look slightly deeper, we see some transnational subjects, like the G7, G20, and ITU, and many regional subjects. And understand me right: until we (you and I) create at least a verbal agreement, we cannot consider any subject "we".

Yes, there exists the "invisible hand of the market", and some other similar things, like libido, but they are all unconscious powers, and you cannot expect them to make conscious decisions and do conscious things.

hagbard_c

5 days ago

Given that it is possible to make a 3D printer out of old dot matrix printer parts, and given that matrix printers were introduced in the '70s (Centronics [1] claims its Model 101, launched in 1970, was the first impact dot matrix printer), it would have been possible to create 3D printers in the '70s. While the hardware and controlling firmware could have been built back then, the result would have been quite useless without the software needed to create 3D models to print on the device.

[1] https://en.wikipedia.org/wiki/Centronics

kragen

5 days ago

However, the Centronics interface was designed to allow them to build the printer without including a microcontroller in it. (Microprocessors of any kind were several years off.) It wasn't until the mid to late 01970s that it became common to include microcontrollers in printers.

As for the question of software for 3D models, see my overview of the history of that field in https://news.ycombinator.com/item?id=42080437.

simne

5 days ago

Just for info: the Centronics interface is very slow for a 3D printer. It could work (as 2D plotters work), but people want figures made faster, and 3D makes things even worse.

kragen

5 days ago

3-D printers are much lower bandwidth than conventional 2-D printers, and the conventional Centronics parallel port could handle up to "75,000 characters per second", according to https://en.wikipedia.org/wiki/Parallel_port#Centronics. The total amount of data in a print is larger, but typically the 3-D printer is only processing 1–10 commands per second, compared to about 100 for a dot-matrix printer.

But the point I was trying to make was that dot-matrix printers predated the availability of microcontrollers, or even microprocessors, and you needed cheap microprocessors to make hobbyist 3-D printers.

simne

5 days ago

3D FDM printers need much higher bandwidth, because their controller needs to generate PWM for motor control, among other things.

Yes, one could make a 3D printer without PWM, but it would be extremely slow, or even unable to do some things.

With modern microcontrollers this problem is resolved by using the interface only for high-level commands (all low-level control is performed by the microcontroller), but early machines used the computer as the controller and had to deal with this.

I have some experience with modern FDM and SLA, and I have seen many cases where FDM was severely limited by the microcontroller's PWM range.

kragen

5 days ago

Right, you need a microcontroller in the printer to avoid having to send PWM signals over your parallel port. But they don't have to be modern microcontrollers. An 8051 or Z80 would be fine, maybe with some external chips for PWM generation.
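
For instance, here is a sketch of mine (not any real firmware; heater_pin is a hypothetical stand-in for a real output latch) of the kind of software PWM a timer interrupt on such a chip could generate:

    #include <stdint.h>

    /* Software PWM as a timer ISR would run it on an 8051/Z80-class
       part.  heater_pin stands in for whatever port write the real
       board needs (hypothetical). */
    static volatile uint8_t duty;        /* 0..255, set by the control loop */
    static volatile uint8_t heater_pin;  /* 1 = heater on, 0 = off */

    void timer_tick(void)                /* imagine this called at ~8 kHz */
    {
        static uint8_t phase;
        phase++;                         /* wraps mod 256 -> ~31 Hz PWM */
        heater_pin = (phase < duty);
    }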

simne

5 days ago

> An 8051 or Z80 would be fine

Maybe. The problem is the definition of G-code, which is much more than 8-bit (as I remember, somewhere between 12 and 18 bits if you consider just the integer numbers, but diagonal lines and arcs, for example, are calculated with floats). Yes, I know a digital 8-bit CPU can calculate 32-bit or even 64-bit floats, but it is slow and needs additional RAM. All these calculations are much easier to do on a 32-bit computer.

> some external chips for PWM

Additional chips are bad for the economics by definition, because the PCB grows and because additional pins also mean additional costs. That is why the first designs were dumb, without any controller at all and driven by the computer: to avoid additional chips.

In modern designs, microcontrollers are used for convenience, so now the printer can run without a working computer (for example, printing from a file on flash).

kragen

4 days ago

I agree that 32-bit chips (and decent instruction sets) make everything much easier.

You wouldn't necessarily have to interpret G-code on the 8-bit microcontroller itself, although it's about the same difficulty as interpreting BASIC on it. Keep in mind that keeping the motors of a 3-D printer busy only requires a few speed changes per second, maybe 10 at most. By contrast, the 8051 in an Epson MX-80 printed about 80 characters per second and had to actuate the 9 hammers in the print head with a new set of voltages 9 times per character, for a total of about 700 deadlines per second.

When Don Lancaster was trying to figure out how to build 3-D printers and other "flutterwumpers" in the 01990s, his idea was to use a bigger computer to rasterize curves into a set of one-byte step commands that would be interpreted by an 8-bit microcontroller, for example in https://www.tinaja.com/glib/muse140.pdf, as I mentioned in https://news.ycombinator.com/item?id=42080682. His first explanation of this concept may have been his July 01993 "Hardware Hacker" column in Electronics Now (previously in Radio Electronics and Modern Electronics) https://www.tinaja.com/glib/hack66.pdf where he's exploring the possibilities opened up by Parallax's BASIC Stamp:

> One potential use for the BASIC Stamp is shown in figure four. I’ve been doing a lot of work with the stupendously great PostScript general purpose computer language. In fact, this is the only language I use for all of my electronic design, pc layouts, stock market analysis, schematics, Book-on-demand publishing, and just about everything else.

> All the camera ready figures you have seen here in Hardware Hacker for years have been done by using nothing but my word processor and PostScript. Device independently.

> The only little problem has been that PostScript I/O tends to end up a tad on the skimpy side. Usually you only have three choices: Dirtying up otherwise clean sheets of paper or plastic; writing files to the hard disk; or returning your data back to a host for recording or other reuse.

> The BASIC Stamp can instantly let you extend the genuine Adobe Level II PostScript to any personal project or machine of your choosing!

> Assume you’ve got a homebrew machine that has an x-axis and y-axis stepper, an up/down mechanism, and a "both steppers home" sensor. This can be a vinyl signcutter, engraving, or embroidery setup. Or an automated printed circuit drill, a wooden sign router, or a Santa Claus machine.

["Santa Claus machine" was Theodore Taylor's 01978 term for a 3-D printer; it's what Lancaster consistently called them in his columns over the years.]

> You could use two of your BASIC Stamp lines for RS423 serial comm with your PostScript printer. Use two lines for both x-axis stepper phases. And two lines for those y-axis stepper phases. One line for pen or drill or whatever up/down. And a final line that zeros only when both steppers are in their home position.

> The hidden beauty here is that all of those fancier PostScript fonts and the level 2 tools immediately become available for use on your own custom homebrew rig. At unbelievably low cost. With zero royalties!

Normally when people do diagonal lines and arcs on computers without FPUs, they don't use floats; they often don't even use integer multiplies per step, but rather strength-reduced algorithms like Bresenham's algorithm and the DDA algorithm Bowyer talks a bit about in https://3dprintingindustry.com/news/interview-dr-adrian-bowy....
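
To make that concrete, a sketch of mine (not period code; step_x()/step_y() are hypothetical stand-ins for pulsing the motor drivers): the per-step inner loop of a Bresenham move is just compares and adds, with no multiplies and no floats:

    #include <stdio.h>

    /* Hypothetical stand-ins for pulsing the X and Y drivers. */
    static int xpos, ypos;
    static void step_x(void) { xpos++; }
    static void step_y(void) { ypos++; }

    /* Bresenham line as coordinated stepper moves: one integer error
       term per step.  Assumes the first octant (dx >= dy >= 0); real
       firmware handles the other octants by swapping and negating. */
    static void move_line(int dx, int dy)
    {
        int err = 2 * dy - dx;
        for (int i = 0; i < dx; i++) {
            step_x();
            if (err > 0) { step_y(); err -= 2 * dx; }
            err += 2 * dy;
        }
    }

    int main(void)
    {
        move_line(200, 81);             /* a shallow diagonal move */
        printf("%d %d\n", xpos, ypos);  /* prints "200 81" */
        return 0;
    }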

I agree with you that additional chips are expensive, but in the overall scheme of things, a few 555s to drive your motors aren't going to make the difference between feasibility and infeasibility.

So I think it's clear that we could have done hobby 3-D printers 40 years ago. What similar opportunities are we missing today?

simne

4 days ago

> What similar opportunities are we missing today?

Anything you have not done yourself, or have not paid someone else to do.

Life is complicated; it is very typical for an opportunity to appear and nobody uses it. Using an opportunity needs three things:

1. The will. 2. Enough qualification (or enough IQ and time to learn). 3. Free (!!!) resources or cheap borrowed funds.

If even one thing from the list is missing, it becomes extremely hard to use the opportunity. For example, I live in Ukraine, and even without the war, regulations in the country are so prohibitive that many services do not enter the country, or enter with severe limitations; PayPal, for instance, entered with only private accounts, and business accounts are not available in Ukraine.

Yes, you understand right: an idea costs nothing, but the ability to implement ideas is worth billions if it finds a fertile environment.

And yes, I have spent a lot of time finding ideas, and I could share with you high-level information on how to find (or generate) ideas. Just let me know if this theme interests you.

kragen

4 days ago

Yes, of great interest! And in particular I'm interested in how to make do in environments full of prohibitive regulations that make it difficult to do anything, because I live in Argentina.

simne

3 days ago

What I learned on Usenet and early forums is to avoid discussing country regulations, so I will not promise to keep supporting this branch of the discussion.

But well, since you ask: about prohibitive regulations, we in Ukraine did (and are doing) two things.

First, we are practically the motherland of anarchy, and there were a few multi-year periods when the country lived without government and without regulations. You can read about deregulation in the books of

https://en.wikipedia.org/wiki/Daron_Acemoglu

and

https://en.wikipedia.org/wiki/Ludwig_von_Mises https://en.wikipedia.org/wiki/Murray_Rothbard

I myself also like Ostrom, Elinor (1990), Governing the Commons: The Evolution of Institutions for Collective Action.

Second, when the government became powerful, we read "prohibitive" as it is: we just avoid doing business in prohibited niches. Because of this a paradox appears: the government and armed forces say the country needs rockets, and have even allocated large budgets to buy rockets from private companies, but nobody makes rockets, as they are prohibited.

In some cases it is possible to run a virtual or semi-abroad business: create the business in the US or EU (or other countries with adequate regulations) and hire people remotely.

In conclusion, I think the best way to deal with prohibitive regulations is simply to flee abroad, to a country with more adequate regulations; you can support the deregulation struggle from abroad if you wish. But if it is hard to flee, you could try to open a business abroad.

For the other opportunities, I am writing an email.

simne

3 days ago

I usually state: to use an opportunity, you must run a business.

Well, I must admit this is not the only way, but it is the most effective.

It is also possible to use an opportunity in an educational environment, in a community, or in a government organization. But from the standpoint of organizational science, all of these are just special types of business: an educational business, a business owned by a community, or a business owned by a government.

And yes, many people are practically running a business but don't think of it as a business. This is a very serious problem for the economy, because they use this "not a business" opinion as a reason to avoid vital business transactions, and thus kill opportunities.

kragen

2 days ago

Hmm, very thought-provoking. But doesn't that make your activities more vulnerable to regulation?

simne

2 days ago

Yes and no.

Yes, I agree that by providing more info you become more vulnerable.

No, because in a competitive market other matters take priority.

You have surely heard about queueing theory, and maybe you know that when an execution unit waits for something, that time is subtracted from its maximum performance; in the worst case the EU could wait 100% of the time, so performance will be zero, no matter how fast the EU could run in the ideal case.

How does this apply to the economy? Business does many things without guarantees, based only on trust ("business is all about trust"; you could make a big sign with this phrase and hang it on the wall as an icon).

In low-tech business, it is typical to invest large resources in stored goods, so people can just pay money and get the product immediately, without waiting until it arrives from the fields.

In high-tech business additional dimensions appear: investments are made into R&D and into product prototypes before mass production.

Services are pure trust, since in many cases you cannot know whether a service suits you before using it, so you either buy a small amount of the service, or the service business spends some of its resources on free trials.

Anyway, in many cases you need a significant amount of borrowed money to start or scale a business, and the more trust a subject has, the cheaper borrowing will be for him.

Statism usually casts the State as an intermediary, which in theory should guarantee deals and even lend cheap money, but this scheme is very limited and vulnerable to bad actors, because the State itself must get resources from somewhere in order to store them and have something to share. If the State has no spare resources, it cannot guarantee anything. Bad actors use lots of tricks to force people to give their resources to the State, and in the extreme case simply use violence to take resources from their owners (which, sure, is bad for trust and kills the free market).

An ideal business, as I said before, is built on trust. For example, there is Islamic banking, which is extremely similar to the idea of a venture fund, and was invented because Muslims are prohibited from lending money at interest. So they create a business with the recipient of the money, get a share in that business, and get involved in it to make sure the money is spent wisely.

And the last thing I should say: some people think a business can grow on internal money, constantly withdrawing some small amount from circulation and reinvesting it into growth.

Well, this scheme works, but there is one problem: many business opportunities are limited in time; there is a sort of time window (you can read more on this in ballistics or space science, where the term refers to the planetary positions that are most energy-efficient). You can use such an opportunity only if you do things fast enough to create your product or service before the window closes.

And growth on internal resources is very slow, usually just a few percent faster than inflation, while finishing inside a good time window in many cases needs large investments, tens of percent of turnover or even more than the whole turnover; so without cheap borrowed money the business will lose the opportunity.

But to borrow money you need to show a clear enough organization; you may have heard about due diligence, but you may not know that due diligence is usually much stricter and deeper than tax reporting. So the conclusion: to have more success in using business opportunities, you must share much more information than you would just to deal with state regulations.

Things look even worse if we look at insurance and financial transactions: typically, insurance and transaction fees depend on trust, so with very high probability a semi-hidden business will constantly lose a few percent of turnover and will be even more limited in using opportunities.

You can read about this in more detail in the books of Acemoglu, Mises, and Elinor Ostrom.

simne

4 days ago

> a few 555s to drive your motors aren't going to make the difference between feasibility and infeasibility

It depends on the current market of the niche in question. In some cases the market price is so low that every additional hole or trace on the PCB could kill the economics; in others, you could include something like an SGI or a Cray as the controller (meaning something severely overpriced) and it would still be profitable.

dragonwriter

5 days ago

> Given that it is possible to make a 3D printer out of old dot matrix printer parts

Using no other parts, at least none others that would not also have been available in the 1970s?

tdeck

5 days ago

Possibly? This guy made a 3D printer out of wood and unipolar stepper motors salvaged from old typewriters:

https://m.youtube.com/watch?v=yz_DKXIuL8U

However getting the right thermoplastic filament would be a major challenge, and slicing would require a high end mainframe with graphical terminals. Everything would be prohibitively slow. Also regular 2D printers were much more expensive back then.

euroderf

5 days ago

> Look at kitchen appliances or metal/wood shop machinery from this era, still heavily analog, mostly made from sheet steel, mostly non-computerized.

Yup I miss that too. Maybe all the appliances came only in avocado green or harvest gold, but they held together and did not spy.

sottol

6 days ago

I think it's probably correct that few people *really* cared until the Reprap project showed a real DIY 3D printer was possible.

Early machines were industrial machines with huge price tags, proper linear motion systems, complicated extrusion systems and so on. There is a bit of a mental leap to go from seeing $100k+ machine and dreaming to design something that can be built for $200-500.

The problems were: no "reference designs", no tried-and-true go-to mechanical parts (like cheap Chinese linear motion rails), no off-the-shelf extruders (they were DIYed!) or heated beds (early models were just PCBs), and so on - imo it just took someone to get this rolling, and that may well have taken 30 years.

I think RepRap was first publicly shown around 2005. From then on it was taken on by more and more makers and refined. It culminated in the early-2010s hype with MakerBots and their contemporaries, but they still cost > $1500 and were far from set-and-forget appliances - like 50% reliable, and slow. We had one at work and I was fascinated, but it printed at 5-20 mm/s, so parts would just take forever and often failed due to bed adhesion, clogs, ...

The last 10-15 years have then seen the popularization of 3D printers through the Prusa i3 and its clones (Ender and other cartesians < $300) and the steady refinement of reliability through better materials. Then the last ~5 years or so significantly bumped up speeds through better linear motion components, motion systems, and input shaping + firmware and ecosystems like Klipper.

Bambu imo got in at just the right time and refined everything that had culminated up to this point into a solid appliance. Imo their genius was more in the industrial design, reliability, and affordable manufacturing than anything else.

TaylorAlexander

6 days ago

A few notes: The patents were a huge impediment. They did not expire till 2008 - before then, printers cost $25k. They were expensive, and Stratasys had no incentive to make them $300 like they are today, so volumes were limited. From their release in 1995 until 2008, Stratasys sold like 13,000 printers according to their company history web page. For a long time now Prusa has shipped more printers than that in a single month. The RepRap Darwin wasn't built until late 2007; then MakerBot and others followed. I bought an Ultimaker in 2011, and that's when it feels to me like home printing started to become viable.

Once anyone could build their own design, a community of hackers and engineers formed that continuously improved the designs with diverse ideas and experiments. That community is what made 3D printing what it is today. And it was illegal for them to do all of that (in many countries) until the patent expiration in 2008. That’s a big reason why it took so long. I think it’s interesting to consider whether this would have happened sooner if they had never been patented, though perhaps the expired patent created a legal safe haven where no one could take away the basic principles by patenting them. Anyway, patents play a big role in this story!

Edit: Some cool history here: https://3dprintingindustry.com/news/interview-dr-adrian-bowy...

kragen

5 days ago

Note that Bowyer doesn't mention the patents at all in this interview, so I question your assertion that the patents were a huge impediment to him. Possibly they dissuaded other people from starting a RepRap-like project years earlier, but I think the vast majority of such people didn't even imagine that an affordable 3-D printer was possible.

TaylorAlexander

4 days ago

Ah that’s a fair point. I asked him:

https://bsky.app/profile/tlalexander.bsky.social/post/3laiw4...

I think I have heard that in the country he was living in, violating patents for personal use was okay. My understanding is that things are much more strict in the USA, where I believe you can’t help others violate patents so you can’t publish work the way Adrian did. I know that in the early days of 3D printing everyone wanted a belt printer, but users and hackers on the forum regularly expressed concern over MakerBot’s patent and the associated legal risk of violating it. That’s why today’s belt printers have the head at a 45 degree angle. It’s a patent workaround because MakerBot’s patent specified a belt that was parallel to the motion axes. At least that’s what I think Brook Drumm of Printrbot told me.

So even if they didn’t affect Adrian, they certainly had a chilling effect. And I don’t think even under Adrian’s legal regime he would have been allowed to sell the work, so more expensive engineering development was prohibited. We didn’t get cheap 3d printers until companies could mass produce existing low cost open source designs, and that mass production was obviously prohibited by the patents. We didn’t get low cost machines until competition was allowed in to the space. And I’ve heard directly from 3D printer hacker developers that patents affected their decisions not to prototype certain new components. Sorry I can’t point you to a source directly but this would probably have been on the mailing list for the Bay Area Reprap club in 2010-2014, or the Ultimaker mailing list in the same timeframe.

kragen

19 hours ago

He seems to have responded:

> Under EU and UK law anyone can use a patented technology to research improvements in it without paying royalties. So no, it wasn't any problem.

> This is not the case in the US.

simne

4 days ago

> in the country he was living in, violating patents for personal use was okay

Looks like context is missing.

In jurisprudence there is the notion of a negligible case, meaning the harm is too small to justify running the full-featured judicial machine. The same consideration applies, for example, when tax regulations ignore tips, because they would need much more resources to administer than could be gathered as taxes.

So laws usually ignore it when you do something prohibited but nobody is harmed, as when you violate a patent but don't tell anybody about it. Unfortunately, this also means you cannot involve other people or build a business on it, so the activity will not scale to the level where we could talk about affordable 3D printers. BTW, this is an opportunity for AI, to use artificial agents instead of human workers, but that is a very different context for now, and not researched well enough to talk about.

kragen

4 days ago

I'm eager to hear what he says!

The Constitutional purpose of the US patent system is to promote the progress of science and the useful arts, so discussing how to practice or improve patented inventions is definitely legal. (That's why patents are published in the first place.) That's what Adrian was doing. There's no "personal use" exemption in the US law, but there is an exemption (in caselaw) for research, although a court decision around that time narrowed it substantially. It would be impractical to practice a patented invention for such protected research purposes if the law prohibited you from even discussing how to practice it. But in fact you can even go so far as to patent an improvement on somebody else's patented invention.

There is a prohibition on inducing others to violate a patent, but generally it's applied in cases where someone is selling a nominally-non-infringing product whose only real use is to practice the patented invention.

I agree that, at the much later time that MakerBot not only existed but had turned evil, its patents had a chilling effect. I can absolutely confirm that! But at that point the cat was already out of the bag, and the original FDM patent had already expired, so that particular patent no longer had such an effect.

The question I intended to discuss (because in my interpretation it's what superconduct123 was asking) is why we didn't get something like the Prusa Mendel V2 in 01980 or 01990 or 02000 instead of 02010. Obviously the concerns people had about patents after 02010 aren't the reason; if it was a chilling effect from patents, it would have had to be a chilling effect much earlier than that.

And I don't think it was. We could imagine a history where there was a low-cost open-source design in 01990 or 01995 or 02000 or 02005 which companies couldn't mass-produce because of the patent, but that isn't the history we actually got. Why not?

I think the answer is basically that a low-cost 3-D printer sounded like such a crazy idea that none of the small number of people who even understood the concept of a 3-D printer decided to dedicate the effort necessary to make it happen. Also, everything was much more difficult then. I think Adrian accelerated the timeline by 2–5 years.

But only 2–5 years. Fab@home or the (non-FDM!) Open3DP group at the University of Washington would probably have been successful without him. MIT started their Center for Bits & Atoms in 02001, and they started up their first "Fab Lab" that year. EMSL built their Candyfab 4000 in 02006, which was the first 3-D printer I saw actually operating (at TechShop, if I recall correctly). So the memes were spreading.

But basically I think you could have built a hobbyist 3-D printer in the 01980s if you'd realized it was a good idea. Just as you could have built a submarine in the 01600s. The necessary technology was available. It would have been enormously more difficult, especially before the advent of cheap PIC16s in the 90s. Finding like-minded people on GEnie or FidoNet would have been much more difficult than in the blogosphere. 3-D modeling in 512KiB of RAM at 1 MIPS would have been limited. But you could have done it.

The question I'm most interested in here is: what similar opportunities are we missing today?

fragmede

18 hours ago

Think we could have had it by 1980, given enough funding and someone with the gift of hindsight. Not sure as a hobbyist thing, given the still nascent Internet and the nature of funding. But I say yes, technologically speaking, based on the story of how they got the pre-GPS Etak Navigator to work, which still blows my mind. https://www.fastcompany.com/3047828/who-needs-gps-the-forgot...

Though that has more obvious advantages as a product consumers want, which translates to more possibility of funding. There's a clear need for a device that gives directions, vs a 3D printer. While I'm very happy with my Bambu 3D printer, it didn't scratch an actual need in my life.

Looking for similar opportunities today, we have access to relatively cheap compute power. A 3D printer in the 80's would not have been all that cheap as a consumer device. Today, though, we have 3D printers and relatively cheap (for now) parts to drive them - motors/gears/belts/µCs.

While LLMs have taken up all the air in the room, they're not the only ML technique out there, and while GPUs aren't cheap, they're fairly cheap considering the cost of a Cray, back when they made them.

Looking at those things, I wonder about the number of parameters you'd need for a system where the AI model is continually updated as new information comes in. What size pet robot would be feasible, backed by a 4090? 5090? 10090 or whatever it's called in six years? Could you make a robot planaria that had basic sensors and responded to limited stimuli? A robot drosophila? Could you make a robot snake with cameras for eyes? Of course, where I'm leading is a robot dog or cat with an AI model that learns from input given to it by its operator.

Not practical enough to get much funding given the state of the industry; the market for a $20,000 robot dog is too small. The technology is only somewhat there, or about to be, and I'm sure it'll be a "why'd that take so long" after the algorithms for it are invented/discovered which makes it seem obvious in hindsight.

I'm sure there are similar opportunities (the army rejected a robot pack mule for being too loud, but how fun would that be?). Or a food replicator - there are some 3D printers that do chocolate, but nothing with an AMS. Or how about a specific type of home robot food machine? How about a machine I put flour and water, a block of cheese, tomato sauce, and a pepperoni into, and out pops a ready-to-eat pizza or calzone? What other advanced device could you make where raw ingredients go in and it cooks a finished product?

Thing is, with the Internet and especially YouTube, I'm sure there's someone trying to build one (I saw the vending machine that's basically that, last time I was in Vegas, but it's way too big and expensive for a consumer product.)

Or a robot arm that plays chess or something else where the arm only needs to manipulate the things in front of it and doesn't need a moving chassis, so you can play chess against someone remotely, without being on a computer. Robot arms have existed since forever but it's only recently that we've gotten enough compute to make controlling them easy for an end user.

I keep going back to robots because parts are 3D printable or sourceable via Amazon, and since 3D printing has really come into its own lately, that would be the thing to take advantage of, in addition to relatively cheap compute.

Looking just at compute, there's a load of stuff that involves helping users automate things on their computer, but that's so obvious a use case that many well-funded players are after it. The problem with distributed computing is that the interconnect speed between nodes on the Internet is just too low to make it useful, so since Folding@home there hasn't been one that garnered the same popularity. But we have this global-scale network of computers. Once the appropriate algorithms and a use case are found, it'll be forehead-slappingly obvious in hindsight. Maybe as a better AI for Civilization games, which afaik still just give advantages to computer civilizations to make them more difficult. If each computer-controlled race had its own computer to run on, how would that play?

grogenaut

6 days ago

There's also an economy of scale going on that snowballed a ton. In 2014 I got my first printer off Craigslist. The BOM was like $1200. Steppers were like $80 ea, a RAMPS board without controllers was 80-120, the controllers were 20-40 each, and the heated bed was a huge PCB so also not cheap. Cheap PCB houses were nascent at the time. Arduinos were $50+. Power supplies like 40-150.

Now you can get the steppers from Amazon for $8, and a control board with stepper controllers for $20 with a built-in 32-bit MCU. At scale, if you're building a lot of them, those parts are going to be even cheaper, maybe even another order of magnitude. For a while it was difficult to even comprehend how much cheaper stuff was getting and what that lets you do. And then you see a resin printer for $140 and realize it's a cast-off screen, one stepper, and some extruded parts.

jdietrich

5 days ago

The widespread commercial availability of polylactic acid was also a significant factor. It's one of the few plastics that can reliably print on an open-framed printer. The cheap i3 and Ender clones just wouldn't have taken off if we were stuck with ABS.

numpad0

5 days ago

IIUC, airsoft "biodegradable" bullets were always PLA. They were bought in kilos and expended in hours.

Doxin

5 days ago

I think PETG would've also worked. Keep in mind neither of those plastics were rare at the time. The filament form factor was essentially nonexistent though, which makes the whole 3d printing business a lot trickier.

kragen

5 days ago

It depends on what you mean by "rare". Both PETG and PLA were commercially available when RepRap started, but they weren't very widely used. PLA's biggest use at the time was reabsorbable medical implants, for example. I don't know what people used PETG for in 02005. I'm not sure either of them existed in the 01970s.

simne

4 days ago

Nearly all plastic bottles under 10 liters are PET (many 10-liter ones are produced from poly(methyl methacrylate)).

PETG is modified PET, so it depends on your point of view (glass half-full or half-empty). Anyway, in Ukraine people recycle plastic bottles into filament, which printer software treats as PETG.

kragen

3 days ago

Yes, PET is the most common plastic!

Doxin

3 days ago

> PLA's biggest use at the time was reabsorbable medical implants

And in disposable forks that liked to advertise compostability. It's no ABS but it isn't and wasn't a rare plastic. I'm not actually entirely sure how common PETG was in products back then, but it's by no means a new invention.

johnny_canuck

5 days ago

I'm curious how much better the printers are these days as I'd love to get another one.

I had a Wanhao Duplicator i3 in 2015 and found it required a lot of tinkering and calibration every time I wanted to use it. I ended up selling it as it was so time consuming to get everything correctly set up that it just killed any interest I had in it.

nameless912

5 days ago

Nowadays spending 500-1000 USD on a machine gets you something that can print almost anything (within size limitations) out of the box with no calibration. The BambuLab printers are nothing short of extraordinary for quite reasonable prices (the A1 mini is only like 300 bucks, but it's small). And their software stack is good enough that 3D printing is roughly as easy as 2D printing (with the same caveats that occasionally your machine will jam and you'll want to chuck it out a window).

ben1040

5 days ago

I had a Makerbot 2X in 2014 and it required constant janitoring every time I wanted to print.

I built a Prusa MK4 this spring; it calibrated itself and printed a great looking piece right from the get-go. The difference is night and day.

criddell

5 days ago

Check out Bambu Lab printers if you want something more modern. They have several models available for less than $1000 and give good results with little messing around.

I say this as someone who doesn’t want 3d-printing as a hobby. It’s just a tool I want to occasionally use in order to get something else done and the less time I have to spend tramming and calibrating, the better.

ok_dad

5 days ago

I bought the cheapest printer available, the Kobra Go I believe, and with very little tweaking I had it running well enough that I don't care to tweak more. I owned a delta printer ten years ago on which I spent $1200 plus another $600 in custom parts, and it ran about the same as my current one (though it was bigger); my current one cost $150 plus shipping. I don't suggest the cheapest one if you have the money, but I only use mine once a month or less, so it was the perfect price vs. functionality. I did build the old one from a kit and modified nearly every mechanism though, so I'm relatively confident with these machines. I do suggest the more expensive, better quality printers if you don't want to tweak stuff much.

pfych

5 days ago

I had an old Ender 3 kit from 3-4 years ago that was nothing but a hassle. My partner bought a BambuLab A1 & it was insane to see how it "just worked" out of the box. Highly recommend.

dv35z

5 days ago

Check to see if there is a community maker-space in your area - including libraries & universities. One of the benefits of using those machines is that they are well-maintained and frequently used & tuned. Also, you can meet other 3D printing experts who can help you with any project - it's a good vibe, and a great way to get back into making without a large investment...

artificialLimbs

5 days ago

I got an Ender 3 v3 CoreXY about a month ago. I just about pulled it out of the box and started hitting print. It's almost that easy. It's been printing almost continuously for about a month with very few problems.

PaulHoule

5 days ago

I remember reading articles in hobby electronics magazines circa 1990 where people were talking about 3-d printing as a nascent technology.

LarsAlereon

6 days ago

It was invented but protected by patents, so it wasn't until those patents expired that companies and hobbyists started to experiment. Early 3D printers hadn't figured out things like layer adhesion yet so parts tended to be too weak to be very useful. It wasn't clear that this was an area for improvement rather than a fundamental downside of the technology.

jerf

6 days ago

It's also easy to underestimate how much computation is involved and how little there was in the 70s. The 3D printer slicers generally require several seconds on modern hardware to calculate their final path. The resulting files are in the several megabytes. My Bambu 3D printer is all but festooned with sensors and is constantly adjusting things, but even the simple ones have more to their firmware than meets the eye. Even assuming that they'd use some simpler algorithms and data structures, you're looking at vast computation times, and most likely, vastly simpler objects being printed.

Even something that seems as simple as "sous vide" cooking, which is basically a heater hooked to a thermocouple, took a lot of little innovations to make practical to hand to the masses.

And then, there's the general improvement in motors speed, precision, and cost, along with any number of advancements here and there and everywhere to make it practical.

Could someone thrust back to the 1970s, and given a fairly substantial budget, make some kind of 3D printer? Probably. But it would be slow, extremely expensive, and able to print only various sizes of plastic bricks and spheres and other very algorithmically simple objects, in rather low quality, without many, many years of further development. I can think of many ways of bodging on various improvements, but they're all going to have their own compromises, not be garden paths to what we now think of as modern 3D printing. (For example, someone could bodge together a machine that offsets a rod on the printer head and then, in the offset space, has an object to be "copied", by basically banging into the object with a very crude sensor, so there's no generation of geometry at all. But this is going to be clumsy, inaccurate, and full of very complex and disheartening limitations.) You're not going to be printing Dwayne "The Rock" Johnson's face embedded into a toilet [1] or anything, at 1MB of geometry. It will be commercially useless and inaccessible to hobbyists.

[1]: https://www.thingiverse.com/thing:6436138/files

etrautmann

6 days ago

Also 3D modeling software and hardware to run it.

silvestrov

6 days ago

Air fryers are even simpler than "sous vide" cooking. No reason they couldn't have been invented in the 50s-60s, when kitchens had good electrical supply.

creaturemachine

5 days ago

Air fryers are just convection ovens packaged differently.

jdietrich

5 days ago

Sort of, but not really. Air fryers have a much faster rate of airflow than any convection oven, which results in significantly faster and more even cooking. The air fryer really is a meaningful technological development.

Ekaros

5 days ago

Smaller volume, slightly bigger/faster motor. Actually, looking at the fan, there is nothing special about it.

Temperature control is probably the hardest part. But in general the air fryer could have been done quite a lot earlier. Maybe the materials for the basket are another aspect.

knowitnone

5 days ago

if you consider adding a fan as "meaningful technological development"

ozim

5 days ago

Well, it is. We had electrical ovens without a fan that were great. We had electrical ovens with a fan that were great. The air fryer has a much more advanced fan with much more power than earlier ones.

For me this is how meaningful technological progress happens. It is not like someone wakes up one day and suddenly has a rocket that reaches space and is caught back on landing by mechanical arms.

ProllyInfamous

5 days ago

I just purchased my first air fryer summer 2023 — it has changed my cooking life, allowing me to rarely ever "eat out" without much meal prep.

"Just adding a fan" is actually an extremely meaningful tech development.

jerf

5 days ago

Oh, there's plenty of technologies that were developed and/or popularized decades after the basic tech stack was available to do it, even under the constraints of "commercially viable". That's an interesting study of its own.

3D printing is not one of them.

kragen

5 days ago

It's plausible that our current 3-D printing design workflow has grown up in the computationally intensive way that it did because all that computation was available; it's easier to just use a least-common-denominator but bulky file format like STL than to worry about the incompatibilities that would result from using more expressive files.

People have been doing CAD/CAM since the 01950s. Boeing started using CNC in 01958 on IBM 704s, and MIT's Servomechanisms Lab (working with the Aircraft Industries Association: https://web.archive.org/web/20090226211027/http://ied.unipr....) sent out CNC ashtrays to newspaper reporters in 01959: https://en.wikipedia.org/wiki/History_of_numerical_control#C.... Pierre Bézier started writing UNISURF in 01968 at Renault, who was using it to design car bodies by 01975. The Utah Teapot was created in 01975, and it consists of nine Bézier patches; you could print the whole dataset on a business card: https://web.archive.org/web/20141120132346/http://www.sjbake...

The IBM 704 was a vacuum-tube machine that could carry out 12000 floating-point additions per second and had a failure about once every 8 hours https://en.wikipedia.org/wiki/IBM_704. The Intel 8008 (not 8088, not 8080, 8008) that came out in 01972 could carry out over 79000 8-bit integer additions per second, which is about the same speed. But much faster computers were already available, such as the PDP-8, in wide use for real-time control, and they very rapidly became much cheaper. Any computation MIT's Servomechanisms Lab could do in the 50s was doable by hobbyists by the 80s.

The reason 3-D printers mostly use stepper motors is that they don't require closed-loop feedback control. 2-D printers from the 01970s used stepper motors for the same reason. They were accessible to hobbyists; in the 80s I had a Heathkit printer someone had built from a kit in the 70s.
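
To make the open-loop point concrete, here is a minimal sketch (Python, purely illustrative; the class, names, and the 80 steps/mm figure are assumptions, not any real firmware) of why a stepper axis needs no sensor: the commanded position is just a step count.

    STEPS_PER_MM = 80  # assumed: 200 steps/rev, 16x microstep, 40 mm of belt per rev

    class OpenLoopAxis:
        """Position is tracked purely by counting the steps we commanded."""
        def __init__(self):
            self.steps = 0

        def move_to(self, mm):
            target = round(mm * STEPS_PER_MM)
            delta = target - self.steps
            for _ in range(abs(delta)):
                self.pulse(1 if delta > 0 else -1)

        def pulse(self, direction):
            # real hardware would toggle STEP/DIR pins with timed delays here
            self.steps += direction

    axis = OpenLoopAxis()
    axis.move_to(10.5)
    print(axis.steps / STEPS_PER_MM)  # 10.5 -- no encoder, no feedback loop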

If you wanted to print Frank Sinatra's face on a toilet, I think you'd probably want at least a 64×64 heightfield to get a recognizable Sinatra; 256×256 would be better than the line-printer pictures we were doing. 8 bits per heightfield point would be 65 kilobytes, which would fit on the floppy disks we were using at the time. This would have been totally feasible, though digitizing Frank Sinatra would have been a nontrivial project, quite aside from printing him.
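
A rough sketch of that arithmetic (Python; the step sizes are assumptions for illustration) shows how little data and logic the heightfield approach needs: one byte per point, and a boustrophedon scan much like a pen plotter's.

    # 256 x 256 points at 8 bits each: 65,536 bytes, within a period floppy's capacity
    W = H = 256
    print(W * H)  # 65536

    def raster_moves(heights, xy_step=0.5, z_scale=0.02):
        """heights: H rows of W values in 0..255; yields (x, y, z) tool positions in mm."""
        for j, row in enumerate(heights):
            cols = range(len(row)) if j % 2 == 0 else range(len(row) - 1, -1, -1)
            for i in cols:  # alternate scan direction each row, like a dot-matrix head
                yield (i * xy_step, j * xy_step, row[i] * z_scale)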

So I don't think computation was the limiting factor.

Your "basically banging into the object with a very crude sensor, so there's no generation of geometry at all" is called a "pantograph" and it has been a common way to copy three-dimensional objects and engrave letters with a milling machine for 180 years: https://en.wikipedia.org/wiki/Pantograph#Sculpture_and_minti...

fragmede

5 days ago

Computation for the printer was possible, but home computing definitely wasn't there yet to let someone model something up in Fusion 360 like we think of today. That was the realm of serious workstations like the ones SGI made.

I don't think that's a show stopper though. If we had had 3D printers in the 1970's and 1980's, sold at Sears, between the computers and the tools, you'd bring home a paper catalog from the 3d model company, thumb through it for models you wanted to build, send off a cheque (and a SASE) and they'd mail you back a cartridge that you plug into your printer so you can print that model. And then get that catalog in the mail forever after.

As a reference for how much home computing power was available back then: the original Apple I came out in 1976 with a "character generator" and was not capable of advanced graphics - the cost of the RAM it would take to draw an entire screen from memory would have been astronomical, so the screen contents were stored as characters and the card was in charge of outputting those characters and rudimentary graphics.

simne

3 days ago

> That was the realm of srs workstations like the ones SGI made

Not exactly. First, as a funny note, Seymour Cray admitted he used a Macintosh to design the Cray.

Second, there was real activity using the Atari ST for 3D design, as a sort of companion machine for a special Commodore model that could process analog video. As I saw it, they wrote frame-by-frame directly to Betacam.

Sure, rendering on the ST's 68k was slow even at TV resolution, but a lot of TV commercials were made that way.

In the late 90s, TV commercials were made on MMX or Pentium Pro machines with 16-32 MB of RAM, using PC frame-grabber cards and RAID HDDs. And that's the history of how SGI went bankrupt: once Maya was ported to x86, it was the end.

kragen

5 days ago

I agree that you couldn't have done Fusion 360 on an Apple ][ or a Compaq Deskpro 386. But see https://news.ycombinator.com/item?id=42087099 for my notes on what kind of 3-D graphics you could have done on hobbyist computers of that time period and how we actually did do 2-D CAD on them.

simne

3 days ago

It was the era of Moore's law, and computers improved extremely quickly. I remember running 3D Max 2 on a 486 with 4 MB of RAM in 1996.

Models were severely limited, at most a few thousand triangles per scene, but it was enough for talented people to make crazy things.

fxtentacle

6 days ago

That's also my understanding. The 3D printing boom started when the patents expired.

scotty79

4 days ago

A wonderful illustration of how terrible patents are for the technological development of humanity and for consumers in general.

fxtentacle

3 days ago

You're not wrong. But at the same time, I feel like a patent is roughly the only thing that could stop a wealthy international corporation from just copying my new product and then out-competing me with their generous marketing budget. If the expected payout is large enough, a patent will then also help me find a law firm who's willing to work purely on commission, meaning it causes one of those rare instances where I might be able to meet an international corp at eye level, even if I'm a solo inventor.

scotty79

3 days ago

Try it. It's not. In the best-case scenario it might make some huge corporation generously throw you some peanuts for your patent, so it can use it for protection against others' creativity and to extort them.

Other corps will just use your invention and skip the markets where you have a patent. And yet their product will find its way there organically.

Many individual inventors who got patents openly talk about how useless and expensive they are.

legitster

6 days ago

What needs would home 3D printing fill in the 70s?

A complete set of woodworking or metalworking tools was a lot cheaper than a home computer. And there were entire magazines dedicated to proliferating free or easily obtained schematics/designs. Labor was also cheaper, and people had more time for hobbies.

I would also dispute the point that it would have been relatively cheap. We are used to the ubiquity of cheap DC motors and precision parts being a click away. But if you were to rummage through a vintage Radio Shack to cobble together a home printer, I think you would struggle to construct anything precise enough from consumer-available parts.

> a melting plastic

Don't sleep on the chemistry of filament. It has to be extremely precise and consistent. We benefit from the massive economies of scale today, but this was small batch stuff 20-30 years ago. And if we are talking about the 1970s the plastics were really primitive by today's standards.

Legend2440

6 days ago

3D printers were first invented in the 80s. There's a combination of several factors why they took off in the 2010s:

1. Cheap stepper motors and electronics from China

2. Expiration of Stratasys patents in 2009

3. Widespread availability of CAD software and desktop computers powerful enough to run it

4. Reprap project made it easy for companies (and individuals!) to develop their own printers

Onavo

6 days ago

Number 1 is huge, it's also the primary driver of the shift from "model planes" to quadcopter drones with enormous capabilities. The crucial parts were brushless motors and ESCs. The Chinese scale brought the pricing down from ~3-4 figures to under 3 figures which was a watershed moment for hobbyist and commercial use cases.

dghlsakjg

5 days ago

The other big one for drones was sensors and the processing to run them. Trivially small cheap gyro position sensors, accelerometers, GPS, pressure sensors, etc... needed to exist since quadcopters are more or less incapable of flying without a computer in the loop.

syntaxing

6 days ago

A mixture of patent protection, China's (lack of) precision manufacturing at scale at that time, and software immaturity. LM8UU linear bearings, around 15 years ago when I started my 3D printing journey, were upwards of $15 PER PIECE. You need at least 8 for a Mendel or i3 style printer from the RepRap boom, so the bearings alone would have cost you $100 or so. Then factor in chromed rods, stepper motors, and stepper motor drivers (Pololu had just released their A4988 boards), and you're looking at almost $1K just for the parts. The software like slicers and even G-code interpreters hadn't even been written yet. Marlin wasn't even a big thing until about 10 years ago. We take for granted how much work the community has put into 3D printers to get us where we are today.

regularfry

5 days ago

I don't think you can overstate the importance of the development of efficient slicing algorithms. It might just be something that would have happened anyway with the other parts of the story in place, but there was a point in time when slicing anything complicated was very painful.
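
For anyone who hasn't looked inside a slicer: the core geometric step is small. A minimal sketch (Python; it ignores the degenerate case of a vertex lying exactly on the plane, which real slicers must handle):

    def slice_triangle(tri, z0):
        """tri: three (x, y, z) vertices. Returns one [(x, y), (x, y)] segment or None."""
        pts = []
        for a, b in ((0, 1), (1, 2), (2, 0)):
            (x1, y1, z1), (x2, y2, z2) = tri[a], tri[b]
            if (z1 - z0) * (z2 - z0) < 0:  # this edge crosses the cutting plane
                t = (z0 - z1) / (z2 - z1)
                pts.append((x1 + t * (x2 - x1), y1 + t * (y2 - y1)))
        return pts if len(pts) == 2 else None

    def slice_layer(triangles, z0):
        # the hard, slow parts come after this: chaining segments into closed
        # loops, insetting perimeters, generating infill and supports
        return [s for s in (slice_triangle(t, z0) for t in triangles) if s]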

tjoff

5 days ago

You don't start with anything complicated, though. You start with cubes, making fixtures, etc.

And when you're waiting 5 hours for the print, it isn't unreasonable to wait an hour or two for the slicer either.

kens

5 days ago

You could buy a 3-D printer from Stratasys in 1990, but it cost $178,000! The printing unit was $130,000 and the Silicon Graphics workstation with slicing software to run it was $48,000 more. So the technology was there, but two orders of magnitude too expensive for the consumer market. Patents account for part of the cost, but the computer to control it was also way too expensive, as well as the custom (vs commoditized) mechanical components.

Link: https://books.google.com/books?id=0bqdMvDMv74C&pg=PA32&dq=st...

Animats

5 days ago

Have you ever seen 1970s CNC gear? I came across a 1970s CNC lathe in a surplus store once. The thing was the size of a small car. The electronics box was the size of two racks. Inside were cards that handled one bit of one register. Input was from a paper tape. No CPU; all special purpose electronics.

Directly controlling industrial machines from a microprocessor was very rare before the 1980s.

snakeyjake

5 days ago

3DBenchy.stl, the little boat that everyone prints as a benchmark during 3d printer reviews, is 10MB.

A 10MB hard drive cost $3,000-4,000 in 1980.

That's $12k-15k today.
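
The size is almost all fixed per-triangle overhead: binary STL stores an 80-byte header, a 4-byte triangle count, then 50 bytes per triangle (normal, three vertices, attribute). So, roughly:

    def stl_triangle_count(file_bytes):
        """Triangles in a binary STL of the given size (80B header + 4B count + 50B each)."""
        return (file_bytes - 84) // 50

    print(stl_triangle_count(10_000_000))  # ~200,000 triangles in a 10 MB file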

Just opening the .stl file and having it render (USABLY) on screen in high resolution was probably not economical until the late 1990s-early 2000s.

I am used to computing tasks being instant to human perception. Today it takes tens of seconds to run repairs on 3D models, which means the same thing would have taken tens of hours in the 90s, if there was even enough RAM.

cyberax

6 days ago

> All the pieces to make a working 3D printer existed even in 1970! and relatively cheaply. So why has it taken so long for 3D home printing to actually become a thing?

Easy. The printing process itself is not that hard.

It's the model _design_ that is tricky. We needed home computers to become powerful enough to run 3D CAD software, and enough people to get proficient with it.

RepRap started in 2005. Realistically, we could have had it maybe a few years earlier. But not _much_ earlier.

the__alchemist

6 days ago

Some observations that I didn't see in the other comments:

- Home 3D printing is often more of a hobby than a traditional prototyping or engineering discipline. People view it as a skill to have, and a fun use of free time. Note how the cheapest and most finicky ones are popular; they can be made to work well through careful operation, troubleshooting, procedures, customization etc. They are not set-and-forget, and I think the userbase likes that.

- Home 3D printer parts (the motors, frames, electronics etc) are almost exclusively sourced from China. We live in an AliBaba world; that wasn't always the case.

whatusername

5 days ago

I'm not sure how true that first statement is anymore. Most of the recommendations I see now are "just get a Bambu labs one". We are much closer to 3D printing as a utility as opposed to 3D printing as a hobby than we were ~3 years ago.

the__alchemist

5 days ago

Great news! Given that time frame, my info is indeed out of date.

superconduct123

6 days ago

That's a really good point

Those AliExpress clone kits were really popular in the community in the beginning

nkrisc

5 days ago

When I started my Industrial Design degree in 2007 the workshop there already had several large, commercial 3D printers for student use.

Mind you, they were nothing like the tabletop consumer ones we have today. They were about the same size as a large American refrigerator.

Since it was not really anything special or amazing for us to have several of them, I have to imagine that industrial 3D printing capabilities were well established by that point.

Edit: as I recall they were mostly used to make parts that could be given a nice surface finish, from which silicone molds could then be made.

jkingsman

6 days ago

I don't know that I'd argue that "consumer" 3D printing, as in a printer at home that just churns out a part when you want it, is even really here yet, certainly not in the way that a dishwasher or lawnmower is. You need to do your own slicing, thinking through supports and brims and layer height, and the printers themselves need no small amount of troubleshooting that is much more than "turn it off and on again". So I'd argue it's still more of a hobbyist realm than a consumer one.

Legend2440

6 days ago

The big limiting factor IMO is CAD skills - otherwise you're just printing parts off thingiverse.

OJFord

6 days ago

Don't underestimate the number of people that do that though.

jimnotgym

5 days ago

Until they have a little boat in each colour... and put the printer in the back of the garage to collect dust.

CAD skills are essential, and it turns out they're not as hard to pick up as you might have thought!

OJFord

5 days ago

I do, I've never printed one of those boats, but not everyone's interested.

If you think of it as a functional/decorative categorisation first of all: obviously some people will overlap, but broadly speaking I think people are interested in one or the other, and within the 'decorative' camp you can go a hell of a lot further without CAD skills, where I think it's more obviously reasonable to not care about designing your own models. You never wanted to design your own toys, but there's appeal in printing things not available on Amazon, unofficial merch for a film you like, or whatever.

Not to say there isn't functional stuff (which I exclusively print) on these sites, but often it won't be quite what I want, so yeah, I end up in Fusion. (And typically starting from scratch eventually, because for some reason people don't share source, and working with imported STLs is hellish.)

dghlsakjg

5 days ago

Resin printers are HUGELY popular with people who paint miniatures and do tabletop stuff. Most of them never design characters and just buy the CAD models to print.

mrguyorama

5 days ago

There are several "CAD lite" systems available if you don't actually need dimensional accuracy though. There's a model boom in DnD circles around sharing 3D models, slicing them up, gluing them together, and making your own designs by basically digital kitbashing.

My friend is filling up hard drives with 3D models DMs share.

artificialLimbs

5 days ago

Bro maybe 3-5 years ago. I pulled my Ender 3v3 out of the box and hit print and it's been running almost constantly for a month. I don't even know what 'bed leveling' is, because it self levels, whatever that means.

jkingsman

3 days ago

I also found my printer (Elegoo Neptune 3 Pro) to be flawless for about a month until it started needing more conscious levelling and bed cleaning between prints, started getting fussier about temperatures (ended up replacing a possibly faulty thermistor that was making my bed temp fluctuate), and dealing with waterlogged filament and needing to solve that with filament driers etc.

I'm not saying it /can't/ be smooth, but I contrast it to my dishwasher, which I expect to wash dishes, correctly and successfully, about 99/100 dishes (assuming I rinse them well), and then when it starts to decline in 3-5 years, that it's time for a new one. 3D printers are not to that level of reliability yet, nor are their support systems around slicing/supporting/etc.

ThrowawayTestr

5 days ago

Just FYI, bed leveling is fine-tuning the distance between the nozzle and the bed, and making sure the bed is parallel to the X and Y axes.
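
"Self leveling" is usually mesh compensation: the firmware probes a grid of Z offsets and bilinearly interpolates a correction at every XY position. A hedged sketch (Python; the grid layout and names are illustrative, not any particular firmware's):

    def z_correction(x, y, mesh, spacing):
        """mesh[j][i] is the probed Z offset at (i*spacing, j*spacing)."""
        i, j = int(x // spacing), int(y // spacing)
        fx, fy = x / spacing - i, y / spacing - j
        z00, z10 = mesh[j][i], mesh[j][i + 1]
        z01, z11 = mesh[j + 1][i], mesh[j + 1][i + 1]
        near = z00 * (1 - fx) + z10 * fx   # interpolate along X at row j
        far = z01 * (1 - fx) + z11 * fx    # ... and at row j+1
        return near * (1 - fy) + far * fy  # then blend along Y

    probed = [[0.00, 0.05, 0.10],
              [0.02, 0.04, 0.08],
              [0.01, 0.03, 0.06]]
    print(z_correction(55.0, 40.0, probed, spacing=50.0))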

paulorlando

6 days ago

Patents do explain some of this delay....

Fused Deposition Modeling or FDM (1989, expired in 2009), Liquid-Based Stereolithography or SLA (1986, expired in 2006), Selective Laser Sintering or SLS (1992, expired in 2012), metal processes like Selective Laser Melting (SLM) and Direct Metal Laser Sintering (DMLS) (1996, expired 2016).

vel0city

6 days ago

One thing to think about is that you'd need quite a beefy machine to even properly render the G-code in the 70s and 80s. In that age of printers, computers often didn't even have fonts; they were ROM cartridges your printer would take. And even into the 90s there were accelerator cards for printing documents and talking to the printer in ways other than parallel ports.

So really, for an average hobbyist the idea of a 3D printer controllable from a home PC wouldn't have been possible until like the mid-90s. So if you really want to look at why it wasn't a thing at some point in history, I'd start the digging there, not in the 1970s.

kragen

5 days ago

If by "render the gcode" you mean "interpret the G-code" then, no, G-code dates from 01963 and is easier to interpret than the BASIC that home computers used in the 01970s. If by "render the gcode" you mean "produce the G-code" then it depends on the geometry you're producing G-code for, and the algorithms you're using to do it, but potentially being able to do it overnight gives you a lot of flexibility there. People were doing computer numerical control with the APT language already in 01959.
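
To illustrate how small the interpretation job is, here is a toy G-code reader (Python; the steps/mm numbers are made-up calibration values, and real firmware interleaves the axes' step pulses, e.g. with Bresenham, instead of handling them one at a time):

    STEPS_PER_MM = {"X": 80.0, "Y": 80.0, "Z": 400.0}  # assumed calibration
    pos = {"X": 0.0, "Y": 0.0, "Z": 0.0}

    def run_line(line):
        words = {w[0]: float(w[1:]) for w in line.split()}
        if words.get("G") in (0.0, 1.0):  # G0/G1: rapid or linear move
            for axis in ("X", "Y", "Z"):
                if axis in words:
                    steps = round((words[axis] - pos[axis]) * STEPS_PER_MM[axis])
                    pos[axis] = words[axis]
                    print(f"{axis}: emit {steps} step pulses")

    run_line("G1 X10.0 Y5.0")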

vel0city

5 days ago

I mean "render the gcode" as in going from a 3D model to the gcode you want your tool to follow. Following the gcode would be generally pretty simple; I agree.

However, I doubt most home hobbyists had computers in their house in 00000000001959 capable of manipulating 3D graphics in a meaningful way that would really be approachable to a general hobbyist. Wikipedia suggests 1960s CNCs were often controlled by a PDP-8 or similar minicomputer. When the PDP-8 first came out in 00000000001965 they cost $000,0018,500, almost $000,180,000 today. I dunno about what kind of home you grew up in, but a machine like that wasn't exactly a home PC available to spend all night rendering the output for a CNC machine, which itself probably also cost several thousand dollars. And that's just the computer driving the CNC, not even thinking about the machines and knowledge it took to actually code the designs and curves and 3D patterns.

I'm well aware of computerized CNC machines from at least the 80s. I had family who owned a machine shop. They were not really hobbyist accessible things.

kragen

5 days ago

We were talking about "the 70s and 80s", not 01959.

My point about APT is that a "beefy machine" in 01959 was the embedded controller in your keyboard by 01983; it wasn't a beefy machine any more. The PDP-8 was indeed pretty common for CNC control in the late 01960s, and it could run about 333000 12-bit instructions per second, which is about half as fast as the original IBM PC. So, yeah, a machine like that was exactly a home PC by the mid-80s. For real-time control of the 3-D printer, you can get by with less.

There were three big problems for manipulating 3-D models for home hobbyists in the 70s and 80s.

One was computational speed: for games, you need to render the 3-D graphics fast enough to provide an illusion of immersion, and with a few hundred thousand instructions per second (and no multiplier) you were limited to a handful of polygons. Like, typically about 20. See the Apple ][ FS1 flight simulator https://youtu.be/lC4YLMLar5I?&t=93, the 01983 Star Wars arcade game (at 2:02), Battlezone (2:14), and Ian Bell and David Braben's Elite from 01984, which was successfully ported from the 2MHz 6502 in the BBC Micro (2:53, but also most of the rest of the hour of the video) to the Z80-based ZX Spectrum (3.5MHz but noticeably slower than the 6502; see https://www.youtube.com/watch?v=Ov4OAteeGWs).

For producing G-code, though, you don't need to be able to handle all the 3-D geometry in 50 milliseconds. You just need to be able to handle it overnight. That's a million times longer, so you can do a million times as much computation on the same hardware. You can't handle a million times as many polygons, because you don't have enough storage space, but you can represent geometry in more expressive ways, like Bézier patches, nine of which made up the Utah Teapot I mentioned in https://news.ycombinator.com/item?id=42080437, or parametric or implicit equations, solids of revolution, CSG, etc.
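
As a sense of scale: a bicubic Bézier patch is 16 control points (192 bytes as 32-bit floats), and evaluating a point on it is a few dozen multiply-adds. A minimal sketch of de Casteljau evaluation (Python, illustrative only):

    def bezier(points, t):
        """Evaluate a cubic Bezier curve at t by repeated linear interpolation."""
        while len(points) > 1:
            points = [tuple(a + t * (b - a) for a, b in zip(p, q))
                      for p, q in zip(points, points[1:])]
        return points[0]

    def patch_point(ctrl, u, v):
        """ctrl: a 4x4 grid of (x, y, z) control points; one surface point at (u, v)."""
        return bezier([bezier(row, u) for row in ctrl], v)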

You do need some kind of user interface for seeing what you're designing that doesn't require waiting overnight to see the results, which I think is what you mean by "a meaningful way that would really be approachable to a general hobbyist". But it's possible I have a different concept of general hobbyists than you do; as I remember it, home computer hobbyists in the 70s were constantly writing things like

    2120FORQ=1TO4:IFLEFT$(R$(Q),1)=O$THENRC=Q:ST=ST+2*Q:DX=DX-2*Q:NEXTQ
and writing programs in assembly language. And machining metal in a machine shop has been a popular hobby in the US for at least a century, using user interfaces that are more demanding than that. So I think hobbyists would have been willing to tolerate a lot of demands on their mental visualization abilities and relatively poor user interfaces if that was the price of 3-D printing.

But the user-interface issue gets us to the second problem hobbyists had with 3D in the 70s and 80s: display hardware. The limited displays of the time were hopelessly inadequate for displaying realistic 3-D. The Star Wars arcade cabinet mentioned above used a vector CRT in order to be able to do high-resolution wireframe, because framebuffers were far too small. With color palettes of 2–16 colors, even Lambertian flat shading was nearly out of reach until the late 80s.

Again, though, this is much more of a problem for games than for 3-D printing.

My first experience with CAD was on an IBM PC-XT with a 4.7MHz 8088, roughly five times faster than the BBC Micro Elite is running on above (https://netlib.org/performance/html/dhrystone.data.col0.html). I was using AutoCAD, a version whose 3D functionality was inadequate for any real use, and the machines had two video cards—a text-only MDA (basically just a character generator and some RAM) and a 320×200 CGA, on which I could see the actual drawing. Redrawing the whole CGA was slow enough that you couldn't do it after every drawing or erasing operation, so erasing a line or arc was instant, but would leave holes through the other lines it had crossed. Until you issued the REDRAW command, which took typically 1–5 seconds for the fairly simple mechanical drawings I was doing. (Remember that 2-D side-scrolling games were impossible on things like the CGA because you just didn't have enough bandwidth to the video card to redraw the whole screen in a single frame.) Zooming in and out also suffered from a similar delay, but was very necessary because of the low-resolution screen.

Once I finished the drawing, I would plot it out on a CalComp pen plotter, which was basically a 3-D printer without the Z-axis. This had enormously higher resolution than the CGA because it didn't have to contain RAM to hold all the lines it had drawn; the paper would remember them. Some hobbyists did have HP-GL pen plotters during this period of time, but it wasn't very common. I had one at home in 01990. My dad used it to make multicolored Christmas cards.

This sort of approach allows you to provide a real-time interactive user interface for modifying geometry even on a computer that is far too slow to rerender all the geometry every screen frame. It would have worked just as well for 3-D as it did for 2-D, which is to say, it would have been clumsy but adequate.

But the third problem was, as you said, the knowledge. Because the 3-D printers didn't exist, hobbyists didn't know the algorithms, they didn't know the vector algebra that underlay them, they didn't have the software, they didn't have BBSes full of 3-D models to download, etc. That would have developed, but not overnight.

vel0city

5 days ago

> it's possible I have a different concept of general hobbyists than you do

Yes, we're talking about very different levels of hobbyists here. I'm talking about the people who are frequently doing 3D printing today. They're often not thinking about the actual algebra and Bézier curves and whatnot. They're playing around in CAD software extruding surfaces and connecting vertices, if they're even going that deep. They're downloading pre-made models off Thingiverse and printing them with their printer, tinkering with the physical aspects of their printers. Let's try printing this model in this material, let's mess around with this infill, what if we print it at this angle, etc. They're not digging deep into the code of how the CAD application actually works to draw the curves. They absolutely wouldn't be hand-typing out the functions to draw The Rock's face onto a toilet, letting it render all night long, going back to tweak their math, letting it render all night long, rinse and repeat.

> on an IBM PC-XT with a 4.7MHz 8088

> the machines had two video cards

A pretty high-end machine for many home users (~$5k new in 000001983, almost $16k today). I realize there were even higher end machines out there though.

> a version whose 3D functionality was inadequate for any real use

Exactly my point. And when would that machine have come out? The PC-XT released in 000001983. So even in 000001983 decently high-end home computers with $001,000+ (of 1982 money, >$3k today) software suites weren't very suitable for even basic CAD work without many pitfalls and limitations.

So general home users who barely passed highschool algebra and don't have degrees in mathematics trying to use 3D printers wouldn't really have the compute capacity to think about what they're trying to make and modify without getting really deep into it, at least until about the 90s. The 3D printing scene as it exists today pretty much couldn't exist until the 90s. Sure, very smart people with piles of cash could have done it in 000001980, potentially even the mid 000001970s, but not like a go to the computer store, buy all you need for <$000,500 (about $000,060 in 000001970s money), plug in to your cheap $000,200 (about $000,020 000001970s) computer, dial into the BBS, grab some files for The Rock's face on a toilet from your BBS, and start printing in an afternoon.

Novosell

5 days ago

Why are you giving your numbers leading 0s?

vel0city

5 days ago

It's to showcase their superior thinking over four digit year neanderthals. A very iamverysmart thing, as if people don't know after the year 9999 there will be a 10000.

Which is why I'll always prepend even more digits when replying, because I'm thinking even further ahead than the short-sighted five digit people and even bigger numbers all the time.

https://longnow.org/

kragen

5 days ago

HN comments are not the right place for personal attacks, and especially not personal attacks for being too nerdy.

082349872349872

4 days ago

00why 0stop 00with 0numerics? why 00not 00prepend 0zeros 00to words 00completely 0at 00random? 0After all, 00if 00you're 0going 00to outnerd 00a 0nerd 00you 0should be 00commited 00to 0the 00schtick.

This 00is 0Hacker 00News, k?

kragen

4 days ago

0000000I 0approve 0000this 0message.

(But wouldn't outnerding a nerd require not just vel0city's smug sense of superiority, but also demonstrating more than a de minimis level of knowledge or technical skill?)

082349872349872

4 days ago

000oops 000sorry 0for 0my 0hand, I'm 0not 0using 0iambic 000or 000any other 0aids*. 0You, 0are 00059 0from 0000here.

* If GenAI is good at poetic forms, could it be asked to generate sentences with a particular mono- vs multi-syllable word pattern? Or would this search better be done in Prolog/with a TTS db?

defrost

4 days ago

Tr0cheeGPT (if you don't mind leading stress) will get you fl0wing like P0e: https://www.youtube.com/watch?v=j8Z0VynTR84

082349872349872

3 days ago

DactylGPT 4o mini fails to count to six, let alone meet the other constraints of a double dactyl: https://en.wikipedia.org/wiki/Double_dactyl#:~:text=Long%2ds...

If I'd attempted to use this to generate an irregular pattern, it'd probably have wound up producing messages decoding to something like "Sammwich Hewix"...

082349872349872

3 days ago

Direct ways offer an intermediate process coding a message. Unfortunately doing it oneself is astonishingly glacial; using a linebreak algorithm prolly would somehow help, nonetheless dyads & monads come slowly; difficulty besets the coding.

082349872349872

4 days ago

MC Poe going to need a gastric rinse / After the glou-glou; he hitting that 'sinthe!

vel0city

4 days ago

> but also demonstrating more than a de minimis level of knowledge or technical skill

HN comments are not the right place for personal attacks. :'(

I'm being attacked for my neanderthal inability to understand five digit numbers :( my primitive understanding of time can only really comprehend two digit year values. Today is the year 1924, right? I failed to write my calendar application to accommodate such high year values as your superior intellect is used to reasoning about. Here I thought ISO 8601 was over-engineered, turns out it didn't go far enough. We're all just too simple-minded compared to our big-brain brethren. May we all learn from our five digit betters.

I'm sorry I'm just not smart enough to be at your level :'(

Izkata

5 days ago

A single 0 just throws me off because it makes me think of octal.

jeffbee

6 days ago

It was the polymers that needed to be invented. In the 1990s we used to order quick-turn prototype 3D prints for parts to check fit before committing to tooling for hard materials, but we only had 3 days to work with it before it fell apart into a puddle of goo.

foxglacier

6 days ago

I don't think that's a reason. PLA and ABS were invented before the 1990s. What was this self-destructing material, and why?

kragen

5 days ago

A surprising number of popular plastics are chemically unstable to one or another extent.

foxglacier

4 days ago

But not 3 days! That sounds like food or ballistic gel, which is also food, I guess.

kragen

4 days ago

Well, not the formulations you buy in off-the-shelf products. But in many cases those require extensive purification and stabilization in order to be suitable for such use. PVC in particular was discovered decades before anyone figured out how to make products out of it.

dekhn

6 days ago

I don't think anybody was really doing FDM/FFF until the 90s, and it didn't really take off until the 2000s. It was quite expensive, required a high level of expertise in both mech E and CS, and existing subtractive methods (like CNC mills and lathes) were very effective.

K0balt

5 days ago

The magic ingredients were inexpensive microcontrollers, massively paralleled MOSFETs (themselves a product of large-scale integrated circuits, since a power MOSFET is actually millions of tiny MOSFETs working together), and the expiration of several key patents, which made it possible to have a commercial explosion around the RepRap scene.

The patents expiring was a big deal, since the main patent was on the fused deposition process itself.

The other factor was that normal desktop computers had become powerful enough to run sophisticated 3D modeling programs and make machine-motion computations from 3D design files.

jeffreyrogers

5 days ago

There weren't good CAD tools to send the designs to the machine until recently. Early CAD tools mostly produced drawings and it was a separate step to manufacture from there.

The way inkjet and laser printers work is also quite different from the way a 3d printer works. The similarity is mostly in the gantry, so there was nontrivial innovation required here.

To some extent 3d printing is probably also a reaction to decreased access to domestic manufacturing. It doesn't make a lot of sense to produce most parts in plastic if you can get a cast or milled part quickly and cheaply.

evoke4908

5 days ago

Part of it is technological advancement. It wasn't until the last 15 years or so that embedded processors became cheap and powerful enough to run a 3D printer at consumer prices.

There's also the problem of 3D modeling and slicing. Again, up until quite recently 3D CAD was out of reach for most consumers, whether due to hardware capabilities or the cost of the software. Slicing is its own entire branch of 3D processing, and it took time to develop all the techniques we use today that make it fast and reliable. Slicing software could only exist after the printers were common.

As well, I expect the availability and materials science of the plastics we use needed some further development.

As I recall, 3D printers rose to prominence at about the same time and speed as genuinely powerful personal computers. You really needed a fast CPU, and printing became more accessible as the early i5/i7 generations became cheaply available.

While you absolutely could build an FDM printer with 80s technology, I don't think it could ever have been practical or affordable. Even if someone had invented all the computational techniques for slicing, the compute available back then was not even close: it would literally have taken a supercomputer to slice your model, or many, many hours on any consumer computer. This would hold true until the early 2000s. At a random guess, I'd say the tipping point would have been around the Pentium 4.

So, same as most technologies we take for granted these days: enabled almost exclusively by the speed and capacity of the computers available to consumers.

simne

5 days ago

Working 3D printers existed for a long time, but only SLS, and very expensive.

Only in the 2000s did the things needed for a cheap 3D printer appear:

1. Cheap power semiconductors.

2. Cheap 32-bit microcontrollers.

3. Computers powerful enough to run a slicer, and free slicer software.

4. For photosensitive resin, a large second-hand market of powerful semiconductor lasers and light modulators (from DLP projectors).

To be more exact, the FIRST powerful transistors appeared in the 1970s but only became cheap in the 1990s; microcontrollers are a similar story, but cheap 32-bit parts didn't exist before the 2000s; microcomputers became powerful enough for 3D printing in the late 1990s, when home CNC motion appeared; lasers are still improving, with something like their own Moore's law.

So to conclude: yes, the sum of technologies needed for a cheap 3D printer appeared around Y2K, but then fans spent a few years constructing practical equipment and making the first shipments to customers (Prusa made his design around 2009).

Ccecil

4 days ago

1. Patents

2. Cheap CAD

3. Arduino (IMHO...before this it wasn't easy to interface to USB/PC for anyone who wasn't an EE)

4. Surge of DIY online (forums, mailing lists, Instructables, Pinterest)

5. RepRap leading the way with recycled parts as well as cheap imported motors and controllers

6. Lack of interest: machinists could usually do the same thing, but better, using subtractive methods. Additive was seen as "useless" for real-world items even after RepRap.

There are probably a couple other things I am missing...and what order you place my list in probably doesn't matter. It took everything happening at once.

simne

4 days ago

> 3. Arduino (IMHO...before this it wasn't easy to interface to USB/PC for anyone who wasn't an EE)

I've seen late-90s CNC machines that used an x86 desktop computer as the controller, and Usenet holds lengthy discussions about which boards were suitable for the task (since they generated the PWM in software on the LPT port).

Sure, the LPT port was not connected to the stepper motors directly, but through opto-couplers.

To be informative, an Arduino is also used to protect a printer from being copied by Chinese manufacturers: a printer driven directly over LPT is not protected at all, but on an Arduino you can set the read-protect bits for the microcontroller flash, and it will not be easy to read it out (even though Western hackers regularly demonstrate reading protected flash, it doesn't happen routinely). And from a marketing view, it is good that a printer with an Arduino can work from a flash card, without a working computer nearby.

Findecanor

5 days ago

In my experience, it really took off when PLA filament became commonplace. ABS and UV-cured epoxy emitted noxious fumes, which required separate ventilation systems, and weren't really suitable for use at home.

Early home 3D printers also required more of the user. It took a lot of tweaking to make them produce decent prints.

eigenvalue

5 days ago

Just having a computer that could handle working with relatively complex 3D models, not even visually but just via the terminal, would have been a tall order in the 80s. Even getting the data from the computer to the printer would be tough: floppies wouldn't store enough data for even modestly complex 3D model files, and a parallel port would be painfully slow. But even if you got around those issues, the algorithms for transforming a 3D model into 2D layers, with proper support structures and minimal material use, are extremely complicated and involve a lot of intermediate data during the calculations. Most machines then wouldn't have had even close to the amount of RAM needed to do that in any reasonable amount of time.

scyzoryk_xyz

5 days ago

My father was an industrial design professor and I remember him showing me a 3D-printed element in the early 00s. He explained at the time that it was the future, but just way, way too expensive and unstable for actual use cases. A chicken-or-egg sort of thing - I agree that it took the internet existing for it to take off.

mmmlinux

5 days ago

1. Patents.

2. CAD software was HUGELY expensive for a long time.

3. Parts were expensive. Economies of scale on things like linear bearings made this crazy. I recently took apart my first-gen Printrbot (it cost me ~$600 and was one of the cheapest ones at the time) and it was all brand-name VXB bearings, because that's basically all that existed.

cynicalsecurity

6 days ago

Are many people going to buy a home 3D printer? It's not a profitable business.

JohnFen

6 days ago

You don't have to have a large market in order to have a profitable business. That said, the market for 3D printers at home is larger than many assume, especially as printers get closer to being simple one-button-push kinds of things.

They will probably never be the sort of thing that exists in every home, but they could very well be the sort of thing that exists in every home workshop.

Legend2440

6 days ago

Huh? There are millions of people with a 3D printer in their home. They're so cheap now that people buy them for their grandkids.

kragen

5 days ago

It took that long because nobody was working on it, because it wasn't obvious that a low-cost 3-D printer was feasible.

The 3-D printers you're seeing today are basically the series of RepRap designs, named after famous scientists who studied self-reproduction: Darwin, Mendel, and Huxley. The RepRap project, which started in 02005, is the reason this happened. For the first several years, it was about half a dozen people: Rhys Jones, Patrick Haufe, Ed Sells, Pejman Iravani, Vik Olliver, Chris Palmer (aka NopHead) and Adrian Bowyer. The last three of these did most of the early work. Once they got it basically working, after many years of work, a lot of other people got involved.

There were a series of developments that had to happen together to get a working low-cost printer. They had to use PLA, because the plastics conventionally used (mostly ABS) had such a high thermal coefficient of expansion that they needed a heated build chamber. They had to design their own electronics, because Arduino didn't exist. They had to figure out how to build a hotend that wouldn't break itself after a few hours. They had to write a slicer. They had to write a G-code interpreter. They weren't industrial engineers, so they didn't know about Kapton. They wasted a lot of time trying to make it work without even a heated bed, to keep costs down. They improvised leadscrews out of threaded rod and garden hose. They made rotational couplings from aquarium tubing. Lots and lots of inventions were needed to get the cost down from US$60k to US$0.3k, and lots and lots of time was wasted on figuring out how to get the resulting janky machines to be reliable enough to be usable at all.

Starting in the mid-90s, Don Lancaster was excited about 3-D printers, which he called "Santa Claus machines" https://www.tinaja.com/santa01.shtml, when he could see that they were possible. He wrote lots of technical articles about building what he called "flutterwumpers": "low cost machines that spit or chomp". https://www.tinaja.com/flut01.shtml. For example, in https://www.tinaja.com/glib/muse140.pdf in 01999, he describes Gordon Robineau's low-cost PCB drill driven by MS-DOS software over a serial port, with a schematic. (The fishing-line cable drive sounds imprecise, since this was years before Spectra braided fishing line.) But nobody listened. I don't know if he ever built so much as a sign cutting machine himself.

Journalists like to talk about the patents, maybe because they're legible to nontechnical people in a way that difficulties with your retraction settings aren't, but when I was obsessively reading the RepRap blogs in the period 02005–02010, I can't recall that they ever mentioned the patents. They were just constantly hacking on their software, fixing their machines, having them break again after a few more hours of printing, and trying new stuff all the time. I don't think the patents even existed in their countries, and they were researchers, anyway, and generally patents don't prevent research. Maybe there's a vast dark-matter bulk of for-profit hackers who would have gotten involved and started up profitable consumer 3-D printing companies before 02005 if it hadn't been for the patents, but who never got interested because of the patents.

But what I saw was that businesspeople started commercializing RepRaps once the open-source RepRap hackers got them to work somewhat reliably. Before that, they mostly weren't thinking about it. After that, most of them spent a lot of years shipping very slightly tweaked RepRap designs. Josef Prusa got involved in the RepRap project and redesigned Ed Sells's Mendel, and everybody copied him, and he famously started selling it himself, very successfully. https://reprap.org/wiki/The_incomplete_RepRap_Prusa_Mendel_b... And more recently Bambu Labs has apparently gotten the machines to be much easier to use.

simne

4 days ago

> Journalists like to talk about the patents, maybe because they're legible to nontechnical people

Journalists usually repeat what experts said; unfortunately, they don't always include enough context to understand it.

The patents context is the global West, where patents are extremely powerful, so nobody can violate a patent and build a business on it. And as far as I know, in many cases people are scared even to talk about something patented, let alone to do something with it.

In the global East (for example, the ex-USSR), patents are mostly harmless, but all the other economic infrastructure needed to make big things (I mean free markets, open borders, powerful VC funds, powerful industry) doesn't exist either.

superconduct123

5 days ago

It's interesting to me that for a lot of products the R&D comes from some big company.

But for at-home 3D printers it seems like it was the hobbyists who did most of the R&D, and the companies came in later.

kragen

5 days ago

Generally, as in this case, a lot of the R&D comes from academia, and from the open source community. For-profit companies aren't very good at stuff that's far out, so they generally free-ride on open-source efforts like Linux, Python, the Web, and RepRap.

Palomides

5 days ago

not talking about patents at the time was intentional, nobody was ignorant of the situation

kragen

5 days ago

I think you should elaborate. Were you at the University of Bath at the time, or at Catalyst IT Ltd. in Auckland?

scotty79

4 days ago

Memory?

We take 1 GB for granted. In the old times even 4 kB was an expensive achievement.

And you need a lot of memory to store the G-code for a useful 3D model.
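
A hedged back-of-envelope (all the counts here are assumptions) suggests this cuts both ways: the G-code for a real print is megabytes, but a controller can stream it a line at a time.

    layers = 200
    segments_per_layer = 500   # assumed model complexity
    bytes_per_line = 30        # e.g. "G1 X12.34 Y56.78 E0.123\n"
    print(layers * segments_per_layer * bytes_per_line)  # 3,000,000 bytes on disk...
    # ...but interpreting it needs only one ~30-byte line buffer in RAM at a time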

iamgopal

6 days ago

The short answer is, there was no need. Personal computers were sold on making nicely printed paper. Here there is no comparably specific application.

declan_roberts

5 days ago

The answer to this is the same reason why flying cars aren't ubiquitous. Invention and discovery are often the easy part.

lawlessone

6 days ago

Is it actually better than homes built by other methods?

The most expensive part of a home is generally the land it's built on.

jillesvangurp

6 days ago

It's actually that and dealing with permitting. The actual problem of creating some kind of durable, comfortable shelter isn't all that hard from a technical point of view. Getting permission to build a house, though, is. Society kind of forces you into debt for a mortgage just to have a place that you can live in. The alternative, paying high rent, is not great either, as you permanently lose the money.

There are all sorts of restrictions and rules that create this artificial scarcity. Even something as simple as buying a plot of land and parking a trailer on it is not legal in most places except in designated trailer parks. You can get a trailer for next to nothing. And lots of people live in them. But try finding a place where it is legal to put one down and live in one. If it were legal, lots of people would do that. Land plots are scarce and once you have one, you can't just do what you want with it in most places.

I'm just using trailers as an example here. Think prefab buildings and raw material cost. This isn't rocket science. We've been building shelters since the stone age.

There are of course good arguments for this to be made in big cities because of a lack of space. But it's equally frowned upon in areas where there's plenty of space.

cynicalsecurity

6 days ago

He means just regular 3D printers used at home. But yeah, the title is confusing.

IIAOPSW

6 days ago

Nonetheless the misinterpreted question is also interesting. The technical hurdles people are citing for why small-scale 3D printers didn't happen earlier are completely different from those of the large-scale "home printers". Why didn't we have automatic concrete-wall-making machines much earlier?

regularfry

6 days ago

The bits of home printing that a machine can do aren't the hard bits, except for very specific buildings where the wall shape is odd, which there's limited demand for. You end up where the choice is a complicated, failure-prone robot in a hostile environment vs a human or three who can do the job quicker once you factor in machine setup.

superconduct123

6 days ago

Ya, I've updated the title to say "consumer" to be clearer

h2odragon

6 days ago

Micro-stepping controller chips.

Before those, the precision available without gearing and feedback wasn't sufficient. There were such systems, but they were an order of magnitude more complicated and several orders more expensive.

regularfry

5 days ago

I don't think it needed microstepping. The first RepRap board bit-banged H-bridges with a PIC, and even later boards only used A3982 drivers. Microstepping helps, but it came later.

You can look at early calibration settings descriptions and they're still talking about e.g. "The number of X stepper-motor steps needed to move 1 mm for the PIC."
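
That number falls straight out of the drivetrain. A sketch of the calculation (Python; the belt pitch and pulley size are assumptions for illustration):

    motor_full_steps = 200      # a 1.8-degree stepper
    microstepping = 1           # early boards: full steps only
    belt_pitch_mm = 5.0         # assumed T5 belt
    pulley_teeth = 8

    mm_per_rev = belt_pitch_mm * pulley_teeth
    steps_per_mm = motor_full_steps * microstepping / mm_per_rev
    print(steps_per_mm)  # 5.0 -- i.e. 0.2 mm per full step on this assumed axis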

foxglacier

6 days ago

How do you reconcile that claim with the existence of cheap consumer paper printers and hard drives?

observationist

6 days ago

Microcontrollers and software for consumer PCs and the like could have been produced, probably, but there are a lot of areas of deep specialization. You'd have had to bring together all sorts of different disciplines and technologies the old, hard way - universities and libraries and researching manually. The internet allowed all of those things to coalesce easily, and novice level people gained access to high quality information and research at the click of a mouse.

The patents, compute, research access, and dozens of other relatively small barriers created a thicket of challenges and no obvious way to reconcile them, even if you had all the right ideas and inspiration. I think the internet would have been needed in order for all those ideas to come together in the right way.

h2odragon

6 days ago

Paper printers only needed such accuracy in one axis of motion, and had gearing to provide it.

Hard drives use voice coils, a completely different technology. The circuitry that does that evolved and certainly influenced the creation of microstepping controllers: the neat trick they do is to treat the stepper motor as a voice coil in between full steps.

flimsypremise

6 days ago

I have several 2-axis microscope stages from the 80s/90s that are driven by brushed motors with position feedback, and they are all capable of higher accuracy than any stepper motor I have. The capability was there, it was just pricey.

Hell, CNC machines existed back then too.

kragen

5 days ago

At the time, hard drives used stepper motors, but didn't use microstepping. Paper printers like the MX-80 used stepper motors too, it's true, but didn't use microstepping either. Gearing makes your step size smaller but adds backlash, so it can be the enemy of precision; position feedback like current inkjet printers use is much more precise.

eesmith

5 days ago

"cheap consumer paper printers and hard drives" was not a 1970s thing.

I mean, towards the end of the decade was something like the ImageWriter, which let you do bitmapped graphics, as a row of 9 dots at a time. At https://www.folklore.org/Thunderscan.html?sort=date you can read about the difficulties of turning it into a scanner. (Like, 'We could almost double the speed if we scanned in both directions, but it was hard to get the adjacent scan lines that were scanned in opposite directions to line up properly.')

The LaserWriter wasn't until 1985 or so. My first hard drive, 30 MB, was a present from my parents around 1987.

By 1996, laser-based 3D printing based on cutting out layers of paper was a thing, available for general use in one of the computing labs in the university building where I worked.

The result smelled like burnt wood.

When I visited a few years later they had switched to some other technology, one that could be colored, but I forget what.

slantyyz

5 days ago

The Thunderscan, for the time, was pretty awesome though. I remember borrowing one from a classmate to make some scans. Given how we keep a document scanner in our pocket these days, the whole notion of sticking a scanner into a printer seems so antiquated and kinda crazy.

Ccecil

4 days ago

Microstepping is mostly just there to reduce noise and vibration.

The motors tend to fall to the nearest full step when loaded hard. Most people I have discussed this with believe there is little resolution gain (if any) past 4x microstepping, but it is certainly a lot quieter to use x256.

refulgentis

6 days ago

Computers in 1970 were so slow we hacked in saying lightness, a scientific measure of color, was the average of the two highest color channels.
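
A sketch of the kind of shortcut being described, assuming it was the standard HSL lightness hack (which averages the highest and lowest channels), versus a properly weighted luminance:

    def hsl_lightness(r, g, b):
        return (max(r, g, b) + min(r, g, b)) / 2   # two compares and an add

    def rec709_luma(r, g, b):
        return 0.2126 * r + 0.7152 * g + 0.0722 * b  # three multiplies: dear in 1970

    print(hsl_lightness(200, 100, 50))  # 125.0
    print(rec709_luma(200, 100, 50))    # 117.65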

al2o3cr

5 days ago

Equivalent statement: "A flying car is just a car with wings, what's so difficult about that???"

spacecadet

6 days ago

I knew someone who recycled paper printers into a home-brew 3D printer and then a 2D ShopBot-style CNC.