As we move towards the 2008 U.S. Presidential election, we’re going to hear more and more about the loss of manufacturing jobs. This is important – folk have to be able to eat. However, what the politicians aren’t talking about is the fundamental transformation going on in manufacturing and how that will impact citizens. That is important too. Manufacturing as we know it is going away – the notion that we can just “keep jobs here” is naive at best. To put this into context, let’s first reflect on what the word manufacturing means:
Manufacturing (from Latin manu factura, “making by hand”) is the use of tools and labor to make things for use or sale. The term may refer to a vast range of human activity, from handicraft to high tech, but is most commonly applied to industrial production, in which raw materials are transformed into finished goods on a large scale.
Notice manu as in manual – this definition is in need of an upgrade since so much of what we call manufacturing doesn’t require human hands touching materials. One could argue that that has been true for a century and that we’re actually at Manufacturing 7.0, but I’ll stick with current terminology. Manufacturing 2.0 is a transition phase that will bring dematerialization to the forefront. reBang’s excellent Next Generation of Product Development Tools series is loaded with videos that illustrate how this is happening.
The word “sampling” is probably most often associated with music, but it’s not at all limited to that application. Physical models are sometimes sculpted and their shape digitally sampled, or a previously existing reference might be digitized and used as a scaffold for building a new, virtual model. Or something entirely unrelated can be sampled and turned into a virtual 3D model. Once digitized, there’s not much that can’t be done with digitally sampled information.
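As a concrete (and purely illustrative) sketch of what digitally sampling a shape means, here’s a Python toy in which a hypothetical probe() function stands in for a 3D scanner. Sweeping it over a grid of directions yields the raw point cloud that later modeling steps would mesh, remix, or mash up; the function names and the test shape are mine, not from any tool mentioned here:

```python
import math

def probe(theta, phi):
    """Hypothetical stand-in for a 3D scanner reading: returns the
    distance from the origin to the object's surface along direction
    (theta, phi). The 'object' here is a sphere of radius 5 with a
    gentle bump pattern, purely for illustration."""
    return 5.0 + 0.5 * math.sin(3 * theta) * math.sin(phi)

def sample_surface(n_theta=36, n_phi=18):
    """Sweep the probe over a grid of directions and collect (x, y, z)
    samples -- the raw point cloud of the digitized shape."""
    points = []
    for i in range(n_theta):
        theta = 2 * math.pi * i / n_theta
        for j in range(1, n_phi):          # skip the poles
            phi = math.pi * j / n_phi
            r = probe(theta, phi)
            points.append((
                r * math.sin(phi) * math.cos(theta),
                r * math.sin(phi) * math.sin(theta),
                r * math.cos(phi),
            ))
    return points

cloud = sample_surface()
print(len(cloud), "sample points")
```

Once a shape exists as data like this, it can be scaled, deformed, combined with other sampled shapes, or used as a scaffold for an entirely new virtual model.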
reBang: Next Generation Product Development Tools, Part 6
This kind of sampling is at the heart of Manufacturing 2.0 and represents a key aspect of Rhythmeering. When Manufacturing 3.0 arrives on the wings of robotics and nanotechnology, man-made items will be works of art and hobby – there won’t be many of today’s manufacturing jobs here or overseas. Sampling and mashups will be important elements of the new industrial base. The government needs to start informing the people and preparing for this future now.
My last post on dematerialization dealt with developments suitable for experienced engineers, but Bug Labs wants to broaden that to include consumers:
Because everything we’re doing is open source, you are free to make it perfect yourself. You want to change something? Go right ahead. And when you do, we’re hoping you share your improvement with everyone else so we all benefit. It’s why we call our work community electronics instead of simply consumer electronics. We, Bug Labs, don’t own the keys to your satisfaction, you do. And this, in our humble opinion, is how it should be.
Whether it’s multi-core approaches such as the Tile64 or FPGA-based approaches such as the OpenSPARC reported on recently, evidence of the dematerialization trend is everywhere. I recently came upon a site that is at the center of this trend:
In those early days of my career, hardware design was a real man’s game. We designed big boxes with loud fans that roared as if boasting of their impressiveness. Then came ASICs, where all of a sudden your innovation was miniaturized into something only an inch across. Today, chips are disappearing altogether and the real design work is in IP; making chips and systems is simply a manufacturing step.
The focus of an engineer today is either in creating IP or in assembling others’ IP into dream-fulfilling subsystems. The power and influence of the engineer simply keeps expanding, with the ability to create larger and larger works from the work of others.
This site should be helpful to people creating solutions requiring devices that interact with the internet.
IBM is pushing its Jazz developer collaboration technology as a research tool and has given money to some universities that are researching how to break down cultural and geographic barriers when developing software…. Three universities were awarded the grants to help drive the software community’s ability to think beyond the individual developer to organizational productivity. The University of California, Irvine, is exploring the use of multi-monitor environments to improve project awareness and development practices. Two other awardees, the University of British Columbia and University of Victoria, both in Canada, are embracing the collaboration capabilities of Jazz and researching software development team interactions and communication.

eWeek: IBM Touts Jazz for Research
In its research, the University of California, Irvine is exploring the use of multi-monitor environments to improve project awareness and development practices. To date, software engineering tools are designed under the assumption that they must effectively operate on a single monitor on a developer’s desk. The trend, however, is to equip developer’s desks with multiple, typically larger monitors, and to equip community areas with tiled displays through which vast amounts of information can be shared. This research leverages Jazz technology to explore how software development tools should be (re)designed to take advantage of this extra display space, with a particular focus on project awareness. The Jazz platform provides many hooks and listeners through which the information that the visualizations need can be obtained.
Via Dwayne Lee at Sun, the following video of an OpenSPARC T1 chip running on an FPGA marks another step towards the kind of software-hardware relationship I think needs to develop. Sun’s next-generation T2 has also been open sourced and should have a significant impact. There’s also a PDF with design details.
For the hands-on, hardware-minded folk, here are some links from a Squeak hardware mailing list discussion on the state of the FPGA-based Plurion project, which zoom in another level on the evolution of the hardware-software relationship.
Via comments on an entry about Fujitsu’s 3D image recognition chip, I came upon this Cadalyst article on visualization which, like the commenter, points out the advantages of integrating CAD and traditional design visualization tools:
The ability to turn a design drawing into a visualization that mimics reality is an invaluable tool for troubleshooting a design, convincing a nervous client or helping to promote a design firm’s capabilities. This segment of the CAD software industry continues to grow and evolve, with rendered visualizations becoming more sophisticated.
While this trend is indeed a positive one, it is fundamentally constrained by an exclusively designer-centric perspective. This view understandably stems from the historical hardware constraints which made customer/end user access to the CAD data prohibitively expensive. However, as can be seen in the high end, proprietary CAD/PLM offerings of Dassault, this tradeoff is not an inherent requirement. In a rhythmeering environment integration is needed for manufacturing, maintenance, supply chain, marketing analysis – throughout the complete product lifecycle. In order to achieve this level of deep integration, the underlying information models for CAD have to become features of the operating system and eventually the hardware. Open source platforms like Croquet point the way.
Along the way to hardware that is really fluid and adaptable, software is weaving its way deeper and deeper into the hardware. A couple of writers envisioned the next near-term steps earlier this year:
… virtualization technologies should be pushed down further into the iron and sold in volume. In short, there should be some way to make these technologies a low-cost part of the system, … it should be made in an on-demand fashion, activated with a key for a nominal fee, complete with physical-to-virtual conversion tools and virtual-to-physical tools to undo the virtualization if customers decide to do that, too.
The effect of free, hardware-based virtualization that is automatically there would make for very interesting x86 servers. Even more so with a few on-board, fully virtualized, multi-fabric I/O channels. Kind of a baby mainframe.
Maybe it’s a crazy idea. Maybe it’s a vision of the future of computing.
Now, in the aftermath of the VMware IPO, we’re seeing it unfold.
“With virtualization, where you can run any operating system on top, it seems a lot more logical that it would be effectively a layer sitting on top of a server,” said Illuminata analyst Gordon Haff. “Why wouldn’t it be supplied with the server?”
XenSource announced XenExpress OEM Edition last week, and market leader VMware this week is announcing VMware ESX Server 3i at its VMworld conference. The products run from flash memory built into a server instead of being installed on the hard drive.
The embedded versions aren’t just a fantasy. VMware has partnerships with IBM, Dell, Hewlett-Packard and Network Appliance. “We expect them to begin integrating ESX Server 3i into their servers later this year or early next,” a VMware representative said.
The move has strategic importance in these relatively early days of virtualization, elevating the profile of virtualization specialists’ products. Getting a foot in the door could help the virtualization specialists get a foot in the doors of customers who might be interested in higher-level products to manage the increasingly sophisticated computing infrastructure that can be built atop virtual machines.
Virtualization has been around for decades, but its inclusion in mainstream computers with x86 chips is bringing it out of the shadows. And the money is following.
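These embedded hypervisors ride on the hardware virtualization extensions now shipping in mainstream x86 chips, and whether a given box has them is easy to check from software. A minimal, Linux-only sketch (the function name is my own; it just parses the flags line of /proc/cpuinfo):

```python
def hw_virt_support(cpuinfo_text):
    """Return which hardware virtualization extension the CPU
    advertises: 'vmx' (Intel VT-x), 'svm' (AMD-V), or None.
    Takes the text of Linux's /proc/cpuinfo as input."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            flags = line.split(":", 1)[1].split()
            if "vmx" in flags:
                return "vmx"
            if "svm" in flags:
                return "svm"
            return None
    return None

# On a real Linux box:
# with open("/proc/cpuinfo") as f:
#     print(hw_virt_support(f.read()))
```

If either flag is present, the chip can run a hypervisor like ESX Server 3i or Xen with full hardware assist, which is exactly what makes shipping one in flash on the motherboard practical.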
There are a number of trends interrelated with this, many of which are growing exponentially. Kansas has left the building.
Rhythmeering is inherently a multi-disciplinary field, so these developments are important steps along the way.
UC Berkeley’s dean of engineering is remaking the school’s program to attract new students. He’s mixing nanotech, biology and social engineering into the agenda.
… Berkeley isn’t alone in trying to mix in these types of subjects. Stanford University has opened its own design school. Chipmaker Intel, meanwhile, has hired anthropologists in recent years to get a better handle on how people, particularly in emerging nations, interact with technology.
The Lighthouse project is doing some really nice work displaying software development processes across multiple monitors. They’re using Google maps, 3D and other location oriented approaches to create new kinds of developer tools. Now they’re integrating these tools with IBM Jazz. Sweet.
I’ve long viewed screen real estate as a huge constraint to be worked around. The tasks and ideas in my mind just don’t fit on a single screen. It’s rare for me to use just a single screen or even a single computer unless it’s my laptop outside of my office. Most of the time I’m using 2-3 machines and lately more and more that ratchets up to 4-5. Typically one of the machines has a dual display so at any given time I have 3-6 screens visible.
The volume of information flowing through IBM Jazz or any development environment doesn’t fit on a single screen so it’s a huge productivity drain to constantly search for, then minimize and maximize windows. Virtual desktops help, but then windows are out of sight.