This entry summarizes major topics that drive the computer and consumer electronics industries. It is intended for newcomers to the field who want a brief summary of hot topics as well as relevant history. When there are significant changes, this entry is updated.
Smartphones - Truly Personal
In the summer of 2008, Apple introduced the iPhone 3G along with its application platform. After using the phone for a while, people have come to realize that it is, in fact, their first "true personal computer." Because the phone is with them so much of the time, the ability to do e-mail, literally ask Google for information, access any Web site (many of which are geared to the mobile market), play music, videos and games, and choose from tens of thousands of applications of every imaginable kind changes the paradigm of what a "personal" computer truly is. In addition, reading text on a tiny screen is easier than many thought, especially when it is presented in a narrow column, the newspaper format people have been using for hundreds of years. What has made all this viable is excellent screen resolution.
The iPhone phenomenon unleashed a wave of touch-screen phones on all the other mobile platforms, some offering more features, such as multitasking, larger screens and support for Flash movies. The two most compelling platforms now are the iPhone and Android, with everyone else trying to catch up. See iPhone, Android and smartphone.
Web 2.0 is turning the Internet into a global computing platform for publishing information and running applications. "User-generated content" is a highly touted aspect of Web 2.0, in which anyone can publish anything in a blog, social networking site or wiki. See user-generated content and social networking site.
As Web-delivered applications increasingly match the performance, look and feel of traditional applications that previously had to be installed on the user's computer, Web 2.0 also refers to running more applications from the Internet. See Web 2.0 and cloud computing.
Increasingly mainstream, cloud computing refers to using third-party Internet providers to host an organization's Web sites and services, as well as using Web-based business software from application service providers (ASPs) on the Internet. See cloud computing.
People are hooked on getting e-mail, stock quotes and up-to-the-minute news no matter where they are. The latter part of the 1990s witnessed a huge increase in wireless communications, which continues unabated. Wi-Fi hotspots have become extremely popular and either function as a convenient way to access the Internet or work like a mini cellphone system, letting users roam between buildings in a large complex with their laptops (see wireless LAN).
GSM was the first cellular technology to support data, and all other cellular systems followed suit. Cellular 3G modems in laptops compete with Wi-Fi networks with the advantage of coverage almost everywhere. Wi-Fi hotspots may be much faster than 3G, but are not ubiquitous. See cellular generations and wireless glossary.
Web services refers to linking two parties together over the Web, such as buyers and sellers or seekers of information and the information itself. Web services use Web protocols as the transport mechanism so that requests can be made and answers retrieved. However, what gives Web services huge potential on the public Internet is the UDDI system, which is used to register a service so that any inquiring party can automatically discover it and then exchange information.
Web services use the XML markup language to define the text and data structures that are exchanged. However, the real hard work is agreeing on a description of the data the two parties plan to pass back and forth. See Web services and SOA.
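A minimal sketch of the exchange described above, in Python. The "agreed-upon description of the data" shows up concretely as the XML element names both parties must share; the `QuoteRequest` and `Symbol` names here are hypothetical, not part of any real service.

```python
# Sketch of a Web services exchange: both sides agree on the XML
# structure (element names are hypothetical for illustration).
import xml.etree.ElementTree as ET

def build_quote_request(symbol: str) -> str:
    """The requesting party serializes its data in the agreed format."""
    req = ET.Element("QuoteRequest")
    ET.SubElement(req, "Symbol").text = symbol
    return ET.tostring(req, encoding="unicode")

def parse_quote_request(xml_text: str) -> str:
    """The receiving service parses the same structure back out."""
    root = ET.fromstring(xml_text)
    return root.findtext("Symbol")
```

In a real deployment the XML would travel over HTTP (typically wrapped in SOAP), but the negotiation of the structure itself is the part the text calls the hard work.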
Nothing in the computer/communications industries ever came onto the scene with more momentum than the World Wide Web on the Internet. The simplicity of the Web's hyperlink, an address that points to another Web page on the same server or on any server in the world, spread like wildfire.
As it embraced e-commerce, every company rethought its strategies for sales and customer relations. Practically every software product was affected, and every application was reworked to deal with the Internet in some manner. Now that the Internet is available on billions of smartphones, access to Web-based content is even more ubiquitous. With video streaming and voice over IP (VoIP) growing daily, the Internet has become the global communications network. Combined with the increasing use of Web-based applications and the myriad opportunities that arise from being able to view and operate anything from anywhere, the Internet has become the backbone of the high-tech world. See Internet, intranet, World Wide Web, cable Internet and IP on Everything.
Throughout the 1990s, new information systems were developed for Windows-based PCs connected by local area networks (LANs) rather than the central computer architecture of mainframes and minicomputers. Although mainframes perform an enormous amount of daily processing in organizations, the trend in the late 1980s and early 1990s was to migrate older systems and develop new ones to client/server architectures. Client/server primarily means storing the database management systems (DBMSs) on the server and using the client workstations to access them (see client/server).
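The division of labor described above can be sketched in a few lines of Python: the DBMS lives with the server, and client workstations send requests rather than touching the data files directly. Queues stand in for the network here, and the table and names are invented for illustration.

```python
# Sketch of the client/server split: the DBMS stays on the server;
# the client only sends a query and receives rows back.
import sqlite3
import threading
import queue

requests, replies = queue.Queue(), queue.Queue()

def server():
    db = sqlite3.connect(":memory:")   # the DBMS lives server-side
    db.execute("CREATE TABLE emp (name TEXT)")
    db.execute("INSERT INTO emp VALUES ('Ada'), ('Grace')")
    sql = requests.get()               # a client's request arrives
    replies.put(db.execute(sql).fetchall())

threading.Thread(target=server, daemon=True).start()
requests.put("SELECT name FROM emp ORDER BY name")   # the client's role
rows = replies.get()                                 # [('Ada',), ('Grace',)]
```

The point of the architecture is exactly this separation: the client never opens the database itself, so the server can enforce access, locking and integrity in one place.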
A major incentive for downsizing to LANs was the wide availability of client/server applications and sources for purchasing PC hardware. Client workstations were mostly Windows-based PCs, and the servers were PCs running a version of Windows, NetWare or were a Sun, HP or IBM server running Unix and eventually Linux.
Although hardware costs may have been less than minicomputers and mainframes, many organizations discovered that maintenance costs for client/server architectures were considerably higher than expected. Along came the Web, and the client part of client/server became the Web browser, which provides a platform-independent, universal interface for accessing data and running applications. Client/server systems, which replaced "legacy" mainframes, began to fall under the legacy umbrella themselves if they were not upgraded to use the Internet in some manner.
Networking is the lifeblood of an organization's high-tech infrastructure. Local applications combined with Internet applications and services continue to increase traffic and place heavy demands on the network. In addition, tying networks together when companies expand or merge is a daunting task for network administrators and IT managers. Since the mid-1990s, three networking trends have taken place: #1 - replacing Ethernet hubs with Ethernet switches, #2 - developing higher-speed backbones and #3 - switching to TCP/IP as the standard communications protocol.
Ethernet switches increase capacity by dedicating the full bandwidth of the link to each pair of communicating stations. They also allow for virtual LANs, which make network administration simpler. Network backbones are being upgraded to Gigabit Ethernet and 10 Gigabit Ethernet. TCP/IP, the protocol of the Internet, has become the standard transport method for local area networks (LANs). See enterprise networking.
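TCP/IP's role as the standard LAN transport can be seen in a few lines of Python: the same socket calls work whether the two endpoints are across the office or across the world. This sketch runs a one-shot echo service over the loopback interface; the port and message are arbitrary.

```python
# Sketch of TCP/IP as the transport: a one-shot echo service and a
# client talking to it over the loopback interface.
import socket
import threading

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))              # port 0: let the OS pick a free port
srv.listen(1)
port = srv.getsockname()[1]

def serve_once():
    conn, _ = srv.accept()
    with conn:
        conn.sendall(conn.recv(1024))   # echo one message back

threading.Thread(target=serve_once, daemon=True).start()

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect(("127.0.0.1", port))
    cli.sendall(b"hello LAN")
    reply = cli.recv(1024)              # b'hello LAN'
```

Because the API is identical for local and remote addresses, applications written this way ran unchanged as organizations standardized their internal networks on TCP/IP.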
Groupware and Collaboration
Groupware is software that lets users share information and collaborate on projects. The pioneering products were Lotus Notes, GroupWise and Microsoft Exchange, which included e-mail along with tools such as document sharing and group calendaring and scheduling.
The Internet brought groupware into focus. Fueled by the ease with which HTML pages can be created and shared, organizations routinely publish millions of Web pages on their internal Web sites (intranets) with data extracted from corporate databases. Groupware evolved into Web 2.0 tools such as the wiki, which lets anyone edit what someone else writes.
As collaborative data grow, however, problems surface. What happens when documents are distributed to remote servers? Which copies are the latest? Who keeps them up to date? What starts out as a simple method of electronically publishing internal documents winds up becoming a strategic information system requiring the same care and attention as the data processing systems that have been deployed for decades. See groupware and collaborative browsing.
The price of hardware continues to plummet. Each year, we get more computer per dollar than we did the year before. A full-blown Windows PC can be purchased for under $1,000 in 2011 U.S. dollars and entry-level machines for under $500. When compared to a few years ago, technology seems to be a huge bargain. However, hardware costs are misleading, because requirements continue to increase as rapidly as costs decrease.
In almost every company, a user is attached to an internal network that is ever expanding. More Internet usage requires faster connections. The general complexity of networking means more in-house expertise or third-party consulting. Although there is a vast amount of off-the-shelf software for myriad requirements, even the smallest organizations have special needs. Custom programming ranges from $75 to $150 an hour, and consultants cost $150 to $300 an hour. Add up a few weeks of third-party time, and the cost of a PC looks like chump change.
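The comparison above is easy to work out with the article's own figures. The three-week engagement is a hypothetical scenario chosen for illustration.

```python
# Worked example of the cost comparison, using the article's figures.
PC_COST = 1000          # full-blown Windows PC, 2011 dollars
CONSULTANT_RATE = 150   # low end of the $150-$300/hour consultant range
HOURS = 3 * 40          # a hypothetical three 40-hour weeks of third-party time

consulting_cost = CONSULTANT_RATE * HOURS
print(consulting_cost)  # 18000 -- the PC costs 1/18th of the engagement
```

Even at the cheapest consulting rate, a short engagement dwarfs the hardware outlay, which is the point the adage misses.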
The adage that hardware is cheap has been used in the computer field for decades, but it tells only one side of the story: hardware may be cheap, but custom programming and consulting are not.
End of hot topics.