“Trusted computing” (TC) is another tool of modern commercial computing. Just what does that mean? The dictionary offers definitions of “trust” such as “have faith in,” “rely on,” “have confidence in” and “count on.”

The concept of, and need for, “trusted” computing started with our own federal government. In 1985, the Department of Defense issued the “Trusted Computer System Evaluation Criteria,” which provided technical hardware/firmware/software security criteria and associated technical evaluation methodologies in support of overall ADP (automated data processing) system security policy. The document described the department's “trusted computer system” and outlined the security features manufacturers needed to build into new and planned commercial products to satisfy its trust requirements for sensitive applications.

Trusted computing has since resurfaced: it evolved out of the Trusted Computing Platform Alliance, a multivendor effort launched in 1999 to define a hardware profile for secure systems. In 2003, the alliance was transformed into the Trusted Computing Group (TCG), an industry consortium including IBM, Microsoft, Sun and Intel. The TCG develops and supports open industry standards for TC across many different platform types; it incorporated and adopted a patent policy precisely because it wants those standards to remain open.

In general terms, the group is working toward making Internet commerce less cumbersome, and therefore more efficient and commonplace, by increasing the security of networked personal computers. The basic system concepts are more positive identification of each client and its operators, and secure encryption of all data between the consumer and the point of sale.

There are two schools of thought about TC. The first refers to increasing the security of networked personal computers, at a high cost, and “trusts” the authorities controlling those networked computers; privacy is not addressed here. In this model, a computing platform is provided so you cannot tamper with the application software, and those applications can communicate securely with their authors and with each other.

The other approach to achieving computer security is to have your own secure computing platform, so your users cannot perform what they are not allowed to perform but can perform what they are allowed to. Users and individuals actually do the controlling.

So how does one provide users with a more secure system to do their everyday work, specialty work or when operating their entertainment devices?

1. In the TC approach, where the hardware and software handle the security, you put your “trust” in the system.

2. In the “other approach,” where the user or organization designs security features into its own system and therefore stays in control, they “trust” their own system.

The Trusted Computing Group's definition of security seems arguable: machines built according to its spec will be more trustworthy from the point of view of software vendors and the content industry, but less trustworthy from the point of view of their owners.

The TC approach

TC does not allow the user to tamper with the application software. This may have first come about from the desire to let DVDs play on the PC but not be copied, and music be downloaded but not swapped. The concept has been rejected by many in the industry, but it makes sense as a business decision: expand the technology to address more than just DVD copying and grow it into controlling almost everything, including the operating system, applications, media and even documents.

Trusted computing can also protect application software so that unlicensed copies won't operate and licensed applications will actually run better, while a computer's hardware and software configuration could itself be “trusted.”

A separate device called a Trusted Platform Module (TPM) chip could ensure that a program's data is protected, that programs don't interfere with each other and that their code is not modified. A PC or other device with a TPM chip would be able to prove to other machines on the network that it is enhanced with this technology. A TPM chip could include these functions:

o Secure I/O: a secure path between the computer and devices, such as the keyboard and screen. Programs are prevented from getting access to what is being typed or displayed in other programs.

o Memory curtaining: programs are kept from reading or writing each other's memory.

o Sealed storage: private information is protected by encrypting it with a key derived from the software and hardware being used. This prevents a virus from reading your private information.

o Remote attestation (the most debated): changes to your computer can be detected by you and others, so you can avoid sending private information to, or accepting important commands from, a compromised or unsecured computer. The computer hardware generates a certificate proving the configuration wasn't tampered with. The other side of this coin is that computer owners cannot have the software changed, even with their full knowledge or consent.

These functions can all work together. For example, secure I/O protects information as it is entered on the keyboard and displayed on the screen, memory curtaining protects it while it is worked on, sealed storage protects it when it is saved to the hard drive, and remote attestation protects it from unauthorized software.
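To make sealed storage and attestation less abstract, here is a conceptual sketch of the measurement idea behind them. This is not the TPM's actual command interface; the component names and the "seal" label are illustrative assumptions. The key point is that each piece of software is hashed into a running register value, and the sealing key is derived from that value, so any change to the software stack yields a different key.

```python
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    """TPM-style extend: the new register value is a hash of the old value plus the measurement."""
    return hashlib.sha256(pcr + measurement).digest()

# Boot-time measurements of the software stack (illustrative values).
pcr = b"\x00" * 32
for component in [b"bootloader v1.2", b"os kernel 5.0", b"media player 3.1"]:
    pcr = extend(pcr, hashlib.sha256(component).digest())

# Sealed storage: the data key is derived from the final register value, so
# the same key is only reproducible on a platform running the same software.
sealing_key = hashlib.sha256(b"seal" + pcr).digest()

# If any component changes (say, a virus patches the player), the register
# and hence the derived key differ, and the sealed data cannot be unlocked.
tampered = b"\x00" * 32
for component in [b"bootloader v1.2", b"os kernel 5.0", b"patched player"]:
    tampered = extend(tampered, hashlib.sha256(component).digest())

assert hashlib.sha256(b"seal" + tampered).digest() != sealing_key
```

The same register value, signed by the hardware, is what a remote party would check during attestation; the one-way hash chain is why the owner cannot forge a "clean" configuration.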

The DRM movement and TC similarities

Let's go back to the Digital Rights Management (DRM) movement and see where it and TC are alike. When you download a music file, there are rules for how you can use that music:

1. Remote attestation is used to send that music only to a music player that enforces the rules.

2. Sealed storage prevents you from opening the file with another player that doesn't enforce those rules.

3. Memory curtaining prevents you from making an unrestricted copy of the file while it's playing.

4. Secure output prevents you from capturing what's sent to the speakers.

These same “rules” are under the umbrella of trusted computing.

TC criticisms

Criticisms of TC include the following:

o TC requires new hardware to be added to today's PCs.

o Sealed storage can't tell the difference between virus software and useful programs.

o Remote attestation disallows unauthorized changes to software and prevents the user from “masquerading” as a different browser to obtain information available only through a certain browser. It fails to distinguish between applications that protect computer owners against attack and applications that protect a computer against its owner.

o If you upgrade computers, sealed storage could prevent you from moving all music files to the new computer. You might be forced to buy the music again.

o Memory curtaining could be implemented purely as a software rewrite, but that appears impractical to some.

o It would be much harder to run unlicensed software; it would simply be locked out.

o TC applications will work better with other TC applications.

o If you are renting software and don't pay the rent, the software could quit working along with the files it created. This means if you stop paying for upgrades to a certain player, you could lose access to all the songs you bought using it.

These problems occur because trusted computing protects programs against everything, including the user. “Owner override,” in which the protections can be bypassed so the PC is no longer used against its owner, has been suggested as a simple solution. This, too, has its bad side, because some people could tell their computers to lie.

One of the concerns brought out at Brown University's 32nd IPP Symposium on the Trusted Computing Group: Goals, Achievements, and Controversies (put on by its Computer Science Department, March 25, 2004) was about the remote attestation feature, where powerful third parties, such as banks, could end up requiring their customers to use certain applications produced by specific vendors, thereby limiting competition. See www.cs.brown.edu/industry/ipp/symposia/ipp32/home.html.

The “other approach”: physical security

This involves setting up a secure system yourself to protect you or your organization from the risk of theft, data loss and physical damage. Here are some basic attributes of physical security that can be implemented (remember, though, that a security system is no stronger than its weakest link):

o Have secure access (card access, biometrics) for the company employees and the data center.

o Restrict what programs can be run or downloaded.

o Use cryptographic techniques to protect data in transit between systems from being intercepted or modified.

o Use strong authentication techniques to ensure end-points are who they say they are.

o Physically keep the computers away from non-users.

o Don't let strangers connect to your computer (connections can also be initiated from a Web site).

o Use complex passwords and change them often.

o Hire a trustworthy system administrator.

o When using encryption, use off-line storage for the decryption key.

o Keep your virus scanner's signature file up to date.

o Only do business with Web sites that have a privacy statement you agree with.

o Disable your cookies.

o Avoid indiscriminate Web surfing.
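The "strong authentication" item in the list above can be sketched as a standard challenge-response exchange. This is a minimal illustration assuming a pre-shared key; real deployments typically rely on certificates or dedicated protocols, and key distribution is out of scope here.

```python
import hashlib
import hmac
import secrets

# Both end-points share this key in advance (how it gets there is out of scope).
shared_key = secrets.token_bytes(32)

def respond(key: bytes, challenge: bytes) -> bytes:
    """Prove knowledge of the key without ever sending the key over the wire."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

# The verifier sends a fresh random challenge; reusing challenges invites replay attacks.
challenge = secrets.token_bytes(16)

# The genuine end-point computes the response with the shared key.
response = respond(shared_key, challenge)

# The verifier recomputes the expected response and compares in constant time.
authentic = hmac.compare_digest(response, respond(shared_key, challenge))

# An impostor without the key cannot produce a matching response.
impostor = hmac.compare_digest(respond(secrets.token_bytes(32), challenge), response)
```

The design choice worth noting is `hmac.compare_digest`, which avoids timing side channels that a naive `==` comparison can leak.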

Above all, it's important to recognize that security is a series of moves and countermoves. Have good security awareness and exercise sound judgment; never let your guard down. Also keep in mind that sometimes attacks can come from within.

In the end, it's up to you

Where this becomes important to you is when you are considering the finer aspects of security for a client. There have been rumblings that users will be pushed into buying TC-compliant hardware and software simply because they may dominate the market and will be needed to communicate and conduct business. Others say this would probably not happen; things will just work better.

o TC will need to be looked at very closely; you may have to purchase new hardware and software from TC purveyors.

o The other method of added physical security will involve an analysis of a company's security needs before purchasing the appropriate devices and/or software.

The difference between the two types is that one has users putting their trust in a “system” devised by someone else, while the other leaves security up to the users themselves. Both approaches fit a very basic definition of trust, where you almost take it for granted that everything will be safe and secure, as long as you purchase the right (or TC) hardware, choose the right (or TC) software to get the results you want, pay attention to all the applicable performance standards, and protect your system in the best way possible.

Once you have made the decision which way to go, stay vigilant and manage the network or your PC. EC

MICHELSON, president of Jackson, Calif.-based Business Communication Services and publisher of the BCS Reports, is an expert in TIA/EIA performance standards. Contact her at www.bcsreports.com or randm@volcano.net.