Talk:Trusted Computing

From Wikipedia, the free encyclopedia
Former featured article candidate: Trusted Computing is a former featured article candidate. Please view the links under Article milestones below to see why the nomination was archived. For older candidates, please check the archive.
Article milestones
Date | Process | Result
August 11, 2006 | Featured article candidate | Not promoted

Weasel Words


The problem with this article is that it has weasel words. Here are some examples.

2nd paragraph

  • "Advocates of the technology" Who?
  • "Opponents believe" Who?
  • "which to critics" Who?

3rd paragraph

  • "A number of" How many?

The nature of trust

  • "security experts define" Who?
  • "Critics characterize" Who?
  • "While proponents claim" Who?
  • "critics counter" Who?
  • "Advocates of" Who?
  • "Proponents of trusted" Who?
  • "There is an amount of" How much?
  • "it is suspected that" Speculation

There's still more, but this is enough to warrant the {{weasel}} tag.

If someone wants to help with this article, all these statements can be transformed into citations from the "proponents" and "opponents" links in the external links section. Dbiagioli 12:00, 12 November 2006 (UTC)[reply]
Made some corrections. I'm removing the weasel tag per exception 2 of http://en.wikipedia.org/wiki/Wikipedia:Avoid_weasel_words#Improving_weasel-worded_statements: "holders of the opinion are too diverse or numerous to qualify". Dbiagioli 20:07, 17 November 2006 (UTC)[reply]

reasons for proposed merge of Trustworthy Computing


Please see my reasons and discuss at Talk:Trustworthy_Computing#reasons for proposed merge. ObsidianOrder

Wave systems


I didn't put the link in originally, but I replaced it; they are actually pretty central to this field. They own many patents in the area and have developed things like trusted keyboards, etc. --Gorgonzilla 20:41, 8 June 2006 (UTC)[reply]

That may be, but there isn't really any useful information there that isn't already in the article. It's a corporate website with various commercial offerings, and there is no obvious notability since Wave Systems Corp. is not mentioned in the article. Since Wikipedia is not a link directory, I say there is no reason to further enlarge the External links section with this link. Haakon 20:47, 8 June 2006 (UTC)[reply]
I think I'd side with Haakon on this one. Some of the other links could probably do with being re-examined too as the article is somewhat external link heavy at the moment. --Boxflux 06:32, 9 June 2006 (UTC)[reply]

Given that the current article equates trusted computing with the TCG, which is in turn a commercial consortium, that position is somewhat difficult to understand. Nor can I quite see why, having decided that the TCG is the only game in town, there should be a separate article. I admit I have not done a lot in the TCG and only attended one meeting, the very first. But there are a lot more games in the trusted/trustworthy space. --Gorgonzilla 00:52, 10 June 2006 (UTC)[reply]

Italian version


Wow, check out the Italian version of this article, which got FA status. I don't speak a word of the language, but perhaps there is an opportunity here to improve the English-language version of the article? -/- Warren 18:03, 15 July 2006 (UTC)[reply]

Hello, I'm one of the contributors to that article and I could also translate some pieces, but it will take some time. Also, my English isn't perfect; however, I'll do my best. Dbiagioli 15:08, 19 July 2006 (UTC)[reply]
The translation from Italian is almost done. Dbiagioli 13:38, 12 August 2006 (UTC)[reply]

POV-check


There appears to be a heavy bias towards a paranoid viewpoint espoused by a small minority in the "Disputed issues" section which lends undue credit to fearmongers. This portion of the article should either be substantially shortened (and "what-if" clauses removed) or the responses of competent professionals who have denounced such myths should be added. —Preceding unsigned comment added by 71.98.89.174 (talkcontribs)

One would expect a disputed section to include heavily disputed topics, and to be characterized as heavily disputed or (depending upon viewpoint) "paranoid" or "naive". Could you list specific examples rather than generically stating that it's heavily biased and/or paranoid? It would help other editors to determine whether there is any substance to the concern and the scope of what you feel is untoward. Right now I'd like to review that section but I have no idea what exact examples you want me to look at. Perhaps you could research the companies concerned, and their responses to these issues, and add the balancing viewpoints yourself if you feel up to it, or at least list what they should be. FT2 (Talk | email) 22:08, 24 July 2006 (UTC)[reply]
I agree with FT2. And besides, I think that the level of paranoia in the article is healthy. I will not use it to protect myself: how can I be sure that the hardware manufacturers will not put asymmetric backdoors in their random key-generators? See kleptography. —Preceding unsigned comment added by 193.11.232.248 (talkcontribs) 21:01, Aug. 17. 2006 (UTC)
I agree with FT2 too -- in fact I think the disputed view is underrepresented. You would be easier to take seriously if you didn't couch your answer in exactly the language the proponents are pushing. For one thing, plenty of "competent professionals" are explicitly referenced in that section. —Preceding unsigned comment added by 172.203.199.18 (talkcontribs) 19:07, Aug. 10. 2007 (UTC)
Dear 71.98.89.174, many people think our Wikipedia: Neutral point of view policy implies that we need to remove highly biased statements from Wikipedia articles -- Wikipedia:Neutral point of view/FAQ#Lack of neutrality as an excuse to delete. However, the NPOV policy actually says the opposite -- "bias is not in itself reason to remove text ... Instead, material that balances the bias should be added, and sources should be found per WP:V."
I agree that "the responses of competent professionals ... should be added.". Please feel free to do so, or at least give us some clues as to where we might find these responses. --68.0.124.33 (talk) 02:59, 31 May 2009 (UTC)[reply]

space punctuation???


I hope I'm not screwing up someone's legit system, but I'm going through and removing a lot of non-grammatical spaces (i.e., a space before a punctuation mark, or double/triple spaces between words). Sorry if this is any problem. --Gbinal 01:44, 15 August 2006 (UTC)[reply]

Proposed owner override for TC


The section on "Proposed owner override for TC" seems to be just another disputed issue. Some people think that it is a good idea, and some people don't. I suggest putting it with the Disputed Issues. Also, it is written with a very anti-TC POV. It complains that the TC folks have refused to scrap a feature in order to please the anti-TC folks. Yes, that's right, just like they disagree on other issues. It should just describe the dispute. Roger 01:29, 23 September 2006 (UTC)[reply]

Frankly, one could also say that remote attestation is a bug, as it is of little use to the average PC user, if any... The TC proponents say that it's not a bug, it's a feature. However, Owner Override is now in the disputed issues section. Dbiagioli 06:17, 23 September 2006 (UTC)[reply]
Thanks. The section still has too much of an anti-TC POV. It refers to "problems" and "solution". To those who think that attestation is a feature, there is no problem and owner override doesn't solve anything. I suggest deleting the whole section, except to say that some TC critics have suggested an owner override in order to give owners more control over their own computers, at the cost of making attestations less meaningful. It could just refer to the Seth Schoen article for details. Roger 08:50, 23 September 2006 (UTC)[reply]
I think that deletion is never a good idea. Perhaps you could add to the section the reasons for which TC proponents think that owner override is a bad idea? Dbiagioli 09:52, 23 September 2006 (UTC)[reply]
I have attempted to make the owner override section more factually correct and I have tried to remove (anti-TC) bias. — Preceding unsigned comment added by 80.215.156.78 (talk) 10:32, 15 February 2016 (UTC)[reply]

Universal Computer


Add a section on the partisan objections of RMS and the GNU project, as well as the Free Software Foundation, on how they claim this undermines Alan Turing's universal computer concept -- that is, that a computer is a machine that can perform the same function as any other existing machine (printing press, fax, polygraph, cassette tapes, records, radio, television, etc.) -- and how trusted computing can possibly limit computers' ability to do these things.

Thanks, --Mofomojo 06:10, 27 September 2006 (UTC)[reply]

Please revert the paragraphs "Key concepts", "Possible uses", and "Disputed issues" to the original order as soon as possible. The article is unreadable and illogical now. Dbiagioli 15:23, 27 September 2006 (UTC)[reply]
I've done it myself, as I've not received any response. Dbiagioli 05:41, 30 September 2006 (UTC)[reply]

US army & trusted computing support


"The U.S. army has also stated that every new PC bought by the army must support trusted computing [3]" - the referenced article DOES NOT state this.

Thanks for spotting this; I've removed the statement from the article. -/- Warren 17:28, 6 October 2006 (UTC)[reply]
Well, the referenced article _stated_ this: see http://www.fcw.com/article95422-07-26-06-Web and http://www.securityfocus.com/brief/265 ... And, by the way, if you look at the 'suggested models' https://ascp.monmouth.army.mil/scp/cb/cb_products.jsp and check them one by one (take for example https://ascp.monmouth.army.mil/scp/cb/cb_item_details.jsp?cat_id=4&ven_id=9), you'll discover that every one has a TPM... The US army has just decided that it doesn't want to publicize that fact. Dbiagioli 06:45, 7 October 2006 (UTC)[reply]

You write at footnote 4 that "the link [no longer states] that pc must have a TPM." That doesn't mean that the Army dropped the requirement, does it? 10/21/2006 Jvsullivan 19:26, 6 November 2006 (UTC)[reply]

Well, if you're able to find proof that the army still requires a TPM, I'll be happy to change the page. I haven't found it. Dbiagioli 14:06, 21 October 2006 (UTC)[reply]

I don't mean to be trouble to you, but you're making the affirmative assertion that the requirement was dropped. To base that affirmative assertion on the absence of evidence that it wasn't dropped doesn't seem very encyclopedic. If I come across a reiteration of the requirement, I'll certainly point it out. But I think you should reconsider characterizing the absence of a reiteration as a reversal. Thanks for your attention. Jvsullivan 17:32, 21 October 2006 (UTC)[reply]

The problem is, we don't have any evidence that the requirement is still in force, so, given that the army changed the page, we have to suppose that it was cancelled... Of course, in my personal opinion the requirement was not dropped, but without proof that the requirement is still in place, what should be written on the page? That we think the requirement is still in place but we've got no evidence of it? Dbiagioli 18:18, 21 October 2006 (UTC)[reply]

"We have to suppose"? The cited source says nothing to support the proposition that the requirement was dropped. What is it about an FAQ page that happens not to mention the continued existence of the requirement that compels publication of a supposition that the requirement has been dropped? This isn't adding up. Please take a look at the list accomplishments under strategic goal 3 in this October 2006 Army publication: http://www.army.mil/ciog6/news/500Day2006Update.pdf : "Require Trusted Platform Module (TPM) 1.2 for new computer buys" Thanks. Jvsullivan 19:35, 21 October 2006 (UTC)[reply]

That's enough for me -- page changed. Dbiagioli 21:24, 21 October 2006 (UTC)[reply]

Apple?


The article claims that Apple uses the TPM chip for the Intel version of Mac OS X. This information seems to be false. See [[1]]

Well, it seems Apple has changed its mind... Old Macs had a TPM inside, as your link shows. Dbiagioli 11:39, 2 November 2006 (UTC)[reply]
No, the article isn't saying that no Macs had TPMs inside. It's saying that TPM or no TPM, Apple never USED the TPM. Apple may have decided to stop including TPM chips in the new models, but the key point being made is that they NEVER USED TPM DRM.
OK, so why has Apple included the TPM in some models? And by the way, the fact that they haven't used it before doesn't mean that they are not planning to do it. Dbiagioli 18:24, 2 November 2006 (UTC)[reply]
By default, Intel motherboards come with TPMs. Look at any other vendor's computers. I guess they just decided in the end to cut that little cost (it's like $1 or so per motherboard) and not include it. Let me say it again: look at other brands. It's very common to have an onboard TPM; that doesn't mean all manufacturers plan on using them. And as for planning to, you can say that of anyone. Sheesh!
I have edited the article to reflect this. I removed Apple from the manufacturers ‘planning’ to use TC, and added a note that they — contrary to popular belief — do not. --DanChr 19:22, 28 August 2007 (UTC)[reply]

Endorsement Key Section


From an inexperienced person's perspective (hence why I'm reading a Wikipedia article on the subject!), there is a missing bit of information in the Endorsement Key section on how the signing of a random number proves the identity and validity of the TPM involved. I presume it is because the manufacturer or another trusted third party holds a copy of the public key, and this is retrieved by the inquirer for the purpose of communication? If this or otherwise is the case, I think it would be worth noting. Thanks. George Jenkins 21:22, 4 November 2006 (UTC)[reply]

No, the new protocol the TCG uses to prove a TPM's identity (direct anonymous attestation) is much more complex and involves a lot of advanced math. I've added a link to its stub, but explaining it in a Wikipedia article is very difficult, IMHO. Dbiagioli 07:41, 5 November 2006 (UTC)[reply]
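For George's question above, the core mechanism (setting aside DAA's extra anonymity math) is an ordinary challenge-response signature: the verifier sends a fresh random nonce, the TPM signs it with a private key it never reveals, and the verifier checks the result with the matching public key, which a trusted third party has vouched belongs to a genuine TPM. A toy, stdlib-only sketch of just that step, using tiny made-up textbook-RSA numbers (not secure, and not the actual TCG protocol):

```python
# Toy challenge-response identity proof. Real TPMs use 2048-bit RSA keys
# and, in the TCG design, the far more complex Direct Anonymous
# Attestation protocol; the tiny numbers here are illustration only.
# Requires Python 3.8+ for pow(e, -1, m).
import hashlib
import secrets

# Tiny RSA key pair standing in for the TPM's key (hypothetical values).
p, q = 61, 53
n = p * q                          # public modulus
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent, kept inside the "TPM"

def tpm_sign(challenge: bytes) -> int:
    """The 'TPM' signs a hash of the verifier's random challenge."""
    h = int.from_bytes(hashlib.sha256(challenge).digest(), "big") % n
    return pow(h, d, n)            # uses the private key

def verifier_check(challenge: bytes, signature: int) -> bool:
    """The verifier, knowing only the public (n, e), checks the signature."""
    h = int.from_bytes(hashlib.sha256(challenge).digest(), "big") % n
    return pow(signature, e, n) == h

nonce = secrets.token_bytes(16)    # fresh nonce, so old signatures can't be replayed
sig = tpm_sign(nonce)
assert verifier_check(nonce, sig)
```

The privacy problem with this simple scheme is that the same public key identifies the same machine everywhere, which is what DAA was designed to avoid.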

Relation to Pentium 3 PSN


Does anyone know how this is different from the P3 PSNs? I seem to remember that they didn't catch on.

The PSNs didn't have a lobby group bribing the politicians... Fosnez 02:26, 28 January 2007 (UTC)[reply]

Lucky Green


I got a question for any experts on the subject: On the internet, there is an abundance of sources (all of them several years old, AFAIK) speaking of the Lucky Green patent incident. For underinformed laymen like me: The story goes something like he filed a patent on using T.C. for commercial purposes right after a conference where some Microsoft spokesperson talked about it, negating a commercial intent of Microsoft on the grounds of "we didn't even know it could be used for that". Sorry for any inaccuracies... I just wondered why there is no mention of that incident anywhere? Mostly I'd like to know how things finally turned out, because there don't seem to be any up to date sources. Kncyu38 14:47, 30 December 2006 (UTC)[reply]

Turing Computer


I suggest deleting the whole section titled, "Alan Turing's Universal Computer and the Free Software Foundation". Someone added a paragraph that helps clarify it, but it is still contradictory and confusing. The only worthwhile thing in the section is mentioning the relation to DRM, but even that is better explained elsewhere in the article. Roger 00:13, 8 January 2007 (UTC)[reply]

+1, I agree. Dbiagioli 13:55, 8 January 2007 (UTC)[reply]
I disagree, I think it's important to show that it's not really a computer in the hands of its owner anymore. "They allege that a trusted computer is not a Universal computer, as defined by Alan Turing[citation needed]. They say that because users can't switch software, they then cannot use free operating systems or participate in an open-source or file sharing community. They also state that through the enforcement of Digital Rights Management built into the hardware, users are not really free to make their computer run whatever functions they see fit and make it emulate any other machine." Tivoization also fits this definition and I think more should be added back...
These comments don't even make any sense. A trusted computer is a Turing machine as much as any other computer. Even the trusted computer critics don't allege that trusted computers are not universal computers, as far as I can see. There is no source for "They allege". All I found was this trusted computing critic saying "you still have Turing-completeness".[[2]] So I am dropping the section. Roger 18:10, 13 January 2007 (UTC)[reply]
I will drop the Turing completeness claim because I don't know enough about it (and I did leave the "citation needed") -- I just added it back because it was there for quite a while anyway. But the crux of the FSF-angle argument is that you can't change the software, isn't it? http://www.theregister.co.uk/2002/11/08/of_tcpa_palladium_and_wernher/ Why can't we include just this passage then: "The open source angle

Alan Cox concentrated on the issues of who owns the platform and who owns the key, neatly using Xbox as an example. If you own the keys, then you have the ability to do what you like with the systems you've bought. Your changing the software would clearly have an impact on the trustworthiness of the keys, and people who had established a trust relationship prior to the change would quite possibly then not trust you. So you just go back to them and establish a new relationship, cool, and Alan's happy with that.

But if you don't own sufficient keys to change the system, and somebody else has the rights to say what you can and cannot do with the system, then the system is, in Cox's view, inherently insecure. Which is the case with Xbox. Cox also points out that where you don't own the keys, then "a third party can say you trust your cable provider" (we suspect Cox's cable provider may be something of an issue for him). More seriously, keys could be interfered with in the name of national security, and even the possibility of this happening clearly destroys trust." —The preceding unsigned comment was added by 74.112.116.90 (talk) 18:33, 13 January 2007 (UTC).[reply]

Some (not all) EKs (endorsement keys) are revocable, with a password provided by the manufacturer. In practice, most people will not be able to remove the EK from their TPM. However, the TPM can still be disabled; on the Xbox 360, the chip (yes, the Xbox also has a TPM) can't be disabled. Dbiagioli 19:33, 13 January 2007 (UTC)[reply]

TC video


There's an amusing video at http://www.lafkon.net/tc/ mirrored at http://www.cs.bham.ac.uk/~mdr/teaching/modules06/security/video/trustedComputing.html and on YouTube http://www.youtube.com/watch?v=K1H7omJW4TI --Bah23 13:46, 6 February 2007 (UTC)[reply]

Foreign language


"Trusted computing... ¡cuando la confianza supera los límites!" ("Trusted computing... when trust exceeds the limits!", linked) is not in English. What's WP policy on this? I suspect the link should appear in the TC article written in that language? --Bah23 16:40, 8 February 2007 (UTC)[reply]

Linked webpages in English are preferable, but if necessary foreign language webpages are allowed. Shinobu (talk) 14:57, 13 June 2008 (UTC)[reply]
The WP:NONENG policy supports Shinobu. Should we restore that reference that was deleted without comment?
I see a bunch of other references were deleted in a following edit. Should we restore those deleted references as well? --68.0.124.33 (talk) 05:14, 31 May 2009 (UTC)[reply]

Key Point not right


I removed this:

Key Point - Trusted Computing is not about your computer being trusted by you, but about your computer being trusted by authorities, and organisations providing the hardware, software and content you will use on your computer. With TC, the term your computer may be inaccurate - it is more a piece of equipment such organisations have granted you the use of and may disable that use if they consider your use contrary to their interests.

This isn't correct. If you have TC features on your computer, then those features can be used to assure that you can trust your computer. Also, there is not necessarily any ability for others to be able to disable your computer. Roger 18:29, 4 March 2007 (UTC)[reply]

Urm, I agree that it's not encyclopedic, but your argument against it is just as flawed ("[TC] features can be used to assure that you can trust your computer"). Perhaps we can compromise by using an FSF quote in its place? 172.213.231.215 12:16, 3 September 2007 (UTC)[reply]
The FSF view is already described. Roger 15:22, 3 September 2007 (UTC)[reply]
The section should not be there since the tone is non-encyclopedic, not because it is wrong. Frankly, the TC features don't protect you from anything. They protect anything from you. TC stops you from cheating in online games, but it doesn't stop your opponents who may be running on a pre-TC machine. So the trust there is entirely misplaced. You can run TCP, and sure enough, your opponents are still able to see through walls, aim perfectly and never miss, etc. You can only really 'trust' that you yourself are not cheating. However, if you need a hardware solution to know if you downloaded or wrote a cheat patch for a game... you need to get into therapy. And this goes down the line of 'benefits.'76.118.215.233 (talk) 05:20, 7 March 2008 (UTC)[reply]

Implicit pro bias?


This article seems implicitly biased in support. In particular, it makes no mention of the considerable controversy surrounding the issue in the introductory paragraphs, deferring that until after masses of technical details have been unloaded. (It's overly long in any case.) Also, it seems to very much play along with the rhetoric of the proponents (overuse of truisms, essentially: trusted computing is computers that are trustworthy and that have trustworthy components). The statements about spam have been widely discredited and certainly shouldn't appear unopposed as they do. It's an unenlightening greywash. I'm not well informed enough to redress the balance, but I feel expert attention is needed. 172.203.199.18 17:49, 10 August 2007 (UTC)[reply]

Linux Kernel


"The Linux kernel has included trusted computing support since version 2.6.13, and there are several projects to implement trusted computing for Linux."

A link, anyone? Or a reference?

"trusted computing support"?. Only as of Linux Kernel 2.6.36 is there even a patch to enable trusted encryption keys for the IMA. A trusted encrypted file system and IMA for attestation, etc. is able to be hacked together but is not fully supported by the mainline kernel. http://linux-ima.sourceforge.net/ Try building a trusted linux platform for yourself and you will realize the current state of affairs.

VmZH88AZQnCjhT40 (talk) 05:00, 8 January 2011 (UTC)[reply]

Encyption -> increased power requirements


It should be mentioned that the 2048-bit strong encryption requires significant processing power, which in turn means increased energy requirements. In the case of protected high-resolution videos this will mean a LOT of energy.

It's interesting to compare this with the current efforts to save the environment by not wasting energy. Nokia phones will warn you to unplug the charger when not in use to eliminate its small quiescent current draw, while your computer will happily waste double the normally necessary power to play back a film. —Preceding unsigned comment added by 85.181.100.68 (talk) 15:16, 31 October 2007 (UTC)[reply]

The 2048-bit key will only be used to encrypt small, critical pieces of information (such as other keys of lesser strength, and a digital signature), not entire files.

Leotohill (talk) 01:31, 27 May 2009 (UTC)[reply]
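Leotohill's point is the standard hybrid-encryption layout: the expensive asymmetric operation runs once per session on a small content key, while the bulk data goes through a fast symmetric cipher, so the per-byte cost of a video is the symmetric cipher's, not RSA's. A toy, stdlib-only sketch of that structure, using tiny made-up RSA numbers and a SHA-256 keystream standing in for real primitives (illustration only, not security):

```python
# Hybrid encryption sketch: wrap a small symmetric key with a (toy)
# asymmetric operation, then encrypt the bulk data symmetrically.
import hashlib
import secrets

# --- toy RSA key pair; real systems would use a 2048-bit modulus ---
p, q = 61, 53
n, e = p * q, 17
d = pow(e, -1, (p - 1) * (q - 1))   # Python 3.8+ modular inverse

def wrap_key_byte(k: int) -> int:    # slow asymmetric op, tiny input
    return pow(k, e, n)

def unwrap_key_byte(c: int) -> int:
    return pow(c, d, n)

# --- fast symmetric stream cipher (SHA-256 keystream) for bulk data ---
def stream_xor(key: bytes, data: bytes) -> bytes:
    out = bytearray()
    for block in range(0, len(data), 32):
        ks = hashlib.sha256(key + block.to_bytes(8, "big")).digest()
        out.extend(b ^ k for b, k in zip(data[block:block + 32], ks))
    return bytes(out)

content_key = secrets.token_bytes(16)               # small session key
wrapped = [wrap_key_byte(b) for b in content_key]   # asymmetric: once

video = b"frame data " * 1000                       # bulk: symmetric only
ciphertext = stream_xor(content_key, video)

# Receiver: one asymmetric unwrap, then cheap symmetric decryption.
recovered = bytes(unwrap_key_byte(c) for c in wrapped)
assert recovered == content_key
assert stream_xor(recovered, ciphertext) == video
```

So the energy cost of the 2048-bit operation is amortized over the whole file; the ongoing power draw is dominated by the symmetric cipher, which hardware implements very cheaply.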

"Possible" uses?


Does the header "Possible uses of TC" mean that they haven't been implemented yet?

I just want to make sure that DRM isn't a function of TC yet. (ZtObOr 03:45, 11 November 2007 (UTC))[reply]

(Sorry, forgot to add signature.)

Today, there are only prototypes (like the Turaya.FairDRM application of the EMSCB [3]) that use TC hardware in order to enforce DRM. Dbiagioli 17:41, 11 November 2007 (UTC)[reply]
DRM is inherently a "function" of the TC as designed and specified by the Trusted Computing Group, just as turning left and right is a "function" of cars. As for implementation, (?)most(?) laptop computers and (?)many(?) desktop computers you buy are already shipping with the Trust chip fused to the motherboard and carry CPUs with explicit support for it. However Microsoft has delayed releasing explicit Operating System usage of Trust chips and very little software out there accesses these Trust chips yet. So the answer to your question is a complicated mix of yes and no. Pretty much yes the DRM hardware is already out there and yes they might already have shoved it down your throat hidden in the last computer you bought, but no they haven't really activated it... yet. Alsee 19:55, 4 December 2007 (UTC)[reply]
Bitlocker uses TC as of Vista. The primary function of TC is quite far from DRM. As a Windows or Mac OS user you already bought into DRM. Linux won't support it, so what will change? VmZH88AZQnCjhT40 (talk) 05:05, 8 January 2011 (UTC)[reply]

Cell Processor


More than a year ago I posted to the Cell_(microprocessor) talk page that the Cell processor had hardware DRM / Trusted Computing support, asking for some sort of coverage in the main article, and all that has come of it is a talk page accusation that this is "fantasia talk" and someone else all-too-typically sweeping it under a generic "security" rug and of course dismissing all Trusted Computing / DRM issues. However, the fact of explicit DRM in the hardware is documented in the very title of IBM's own PDF publication on the Cell: "Cell Broadband Engine Support for Privacy Security and Digital Rights Management", with the Trust design explicitly covered on the very first page. (IBM took down the PDF at the original link I had posted, but the Google link I just gave finds the paper at multiple other locations.) I have read some other papers from IBM itself documenting crypto keys and crypto mechanisms embedded in the Cell chip; however, I have been having a very difficult time locating adequate coverage and explanation of it. I have only a piecemeal understanding of the ultimate DRM/TrustedComputing implications of the design, and I do not feel confident writing it up in the main Cell article. Is there maybe anyone over here who happens to be familiar with these aspects of the Cell design who could add some mention of this issue to the Cell article? I hesitate to do a half-assed writeup myself, and I don't relish the prospect of digging around for enough technical documentation trying to develop personal expertise on the Cell design. I already spend all too many hours studying the entire low-level technical design of the TPM chip, chuckle. Alsee 19:33, 4 December 2007 (UTC)[reply]

I'm not familiar with the Cell architecture, but I can assure you that your paper describes exactly a TCG architecture, including trusted boot and the hypervisor kernel (you can see the general trusted stack architecture here [4]). I'm not sure if the Cell engine is *officially* TCG-compliant. Dbiagioli 20:08, 4 December 2007 (UTC)[reply]
External links

I have a question: how many links should be in the "External links" section? How many references can an article have? Is there a Wikipedia guideline about it? Dbiagioli (talk) 20:57, 18 December 2007 (UTC)[reply]

"Secure Computing" which focuses on anonymity


The article includes this sentence: "Contrast Trusted Computing with secure computing in which anonymity, not disclosure, is the main concern." Clicking on the link to secure computing takes you to an article about computer security. However, anonymity is NOT the primary concern of secure computing as described there, and if "secure computing" is in fact a different concept focused on anonymity, then I haven't found anything about it online.

Could someone please either explain or correct this?James mcl (talk) 18:27, 31 January 2008 (UTC)[reply]

corrected Dbiagioli (talk) 14:33, 10 February 2008 (UTC)[reply]
Thank you Dbiagioli! James mcl (talk) 14:19, 20 February 2008 (UTC)[reply]

Trusted computing and grid computing


I do not know much about trusted computing, so please bear with me. I would like to understand what (if anything) trusted computing is about aside from its controversial role in preventing unauthorized copying of music and programs. After reading the article, I still do not think I understand the whole story.

I have only seen three use cases for trusted computing that seem at all interesting: large-scale distributed computations ("grid computing") where the owner of the client machines should not necessarily be trusted, like SETI@home; DRM; and games. (I think the others described in this article require a counter-productive assumption that separate computers ought to be distinguishable. In the 'Protecting hard-drive data' example, what if the motherboard breaks so I must replace it and I need access to my data? In the 'Identity theft protection' example, what if I have to switch from one computer to another in the middle of a transaction, or what if the bank changes servers to deal with an increased number of customers, or relocation? But I digress.) The first of these use cases (grid computing) is not emphasized at all in this article, but it sounds like just the sort of thing that would get a researcher excited about ideas like trusted systems.

Did the model of trusted computing described in this article come from academic work, and what motivated it? What was the early history of its implementation like? What originally motivated manufacturers to include trusted computing chips, before this became such a big political issue in connection with copyright and trade secrets? Or was copyright the original motivation for the work?

I think it is the Endorsement key that people have a problem with and connect with the idea of trusted computing. After all, a trusted I/O path, memory protection, encrypted storage, and attestation of various facts by trusted parties are not new ideas and are often made possible through the (implicitly trusted) operating system. Why should we consider hardware more trustworthy than software? But the endorsement key means that a user does not have full specifications of all the relevant details of his computer and has sacrificed some control for other benefits. Thus all the talk (in the "Possible applications" section) of the benefits of a trusted path, verification of identity, and digital signatures did not seem to be very convincing or relevant. Am I missing something?

Projects like SETI@home face a real problem in the possibility of a malicious user ruining the whole project with fake data. It is really exciting if we have found a way to surmount that problem. Does trusted computing provide this? If so, how? These are the kind of questions I wished this article had answered!

Thanks for your work so far in maintaining this article on a contentious issue. 209.252.104.131 (talk) 01:01, 26 April 2008 (UTC)[reply]

Consumer and commercial operating systems have demonstrated an abysmal track record with regard to security and trustworthiness. Memory protection is non-existent after the kernel is compromised. x86 lacks write protection for memory. The same concerns as in grid computing apply to the use of a single-core workstation, due to the poor security architecture of consumer and commercial operating systems: as they have classically evolved under performance and multi-tasking/multi-user pressures, security has always come second. TC provides a record of execution and file access that is not subject to subversion by untrusted code (kernel/user). VmZH88AZQnCjhT40 (talk) 05:10, 8 January 2011 (UTC)[reply]
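The "record of execution" mentioned above is built by hash-chaining measurements into the TPM's Platform Configuration Registers (PCRs): each component is measured before it runs, and a PCR can only be extended, never rewritten, so the final value commits to the whole boot sequence in order. A minimal sketch of the extend operation (TPM 1.2 actually uses SHA-1 and 20-byte PCRs; SHA-256 is used here for illustration):

```python
# Sketch of the TPM PCR "extend" operation behind measured boot.
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    """PCR_new = H(PCR_old || H(component)); not reversible or reorderable."""
    return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

pcr = bytes(32)  # PCRs reset to all-zeroes at power-on
for component in [b"firmware", b"bootloader", b"kernel", b"initrd"]:
    pcr = extend(pcr, component)

# Changing any component, or the order, yields a different final value,
# which is what a remote-attestation quote later signs and reports.
tampered = bytes(32)
for component in [b"firmware", b"bootloader", b"hacked-kernel", b"initrd"]:
    tampered = extend(tampered, component)
assert tampered != pcr
```

This is why the record resists subversion by later-running code: a compromised kernel can extend the PCR further, but it cannot remove or rewrite the measurement that already recorded it.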

Outdated reference


"^ Tony McFadden (March 26, 2006). TPM Matrix. Retrieved on 2006-05-05." http://www.tonymcfadden.net/tpmvendors.htm The page no longer exists. I hope someone can find an equivalent reference. —Preceding unsigned comment added by 81.202.73.10 (talk) 15:26, 25 May 2008 (UTC)[reply]

The endorsement key CANNOT be used for signing


every Trusted Platform Module (TPM) is required to sign a random number

This is NOT true. The endorsement key cannot be used for signing. —Preceding unsigned comment added by 147.188.192.41 (talk) 13:43, 30 May 2008 (UTC)[reply]

Treacherous Computing


Someone removed the text on how opponents/critics of Trusted Computing call it Treacherous Computing. Here are numerous sources, [5] and [6], many of which are reliable, backing up the use of this term in this context. It belongs on this page. Cazort (talk) 15:05, 3 March 2009 (UTC)[reply]

Even if a source is verified, it can still be very biased, so the content was removed for a good reason. Perhaps if you can get a strong enough consensus, then you can change it without any retaliation. ZtObOr 01:12, 3 May 2009 (UTC)[reply]
The use of the term Treacherous Computing by opponents of the technology is discussed not only in numerous computer news sources as I showed above, but also in scholarly books published by Springer, such as these two: [7] and [8]. The 2004 article by R. Anderson is listed by google scholar as having 76 citations, which places it among the authoritative works on the subject. These sorts of sources are about as good as it gets when it comes to editorial integrity and influence. Perhaps the old text relied more on biased sources...but even there, wikipedia has no business excluding sources just because they are biased--the use of the term "treacherous computing" has more than enough coverage in news sources...it's not just that Richard Stallman and the Free Software Foundation have released documents coining some new term...it's that doing so attracted media coverage of the stance and the new term. Not only did the Sydney Morning Herald: [9] and BusinessWeek: [10] publish Stallman's opinions, but you have people then writing ABOUT Stallman's opinions, e.g.: [11]. Cazort (talk) 15:15, 4 May 2009 (UTC)[reply]

Edit war on trust section


I think it's not constructive to keep deleting and re-adding this text. The text needs to be sourced. In the absence of sources I think it's better to omit it than include it. However, I think some of the existing sources already included on the page actually back some of it up. We need to go through these sources and read them, and find out what of this paragraph can be backed up, by what sources...and if anything cannot (and I suspect all of it can be backed up, probably by existing sources) then it should be deleted. Cazort (talk) 20:38, 9 March 2009 (UTC)[reply]

Dubious


Protection of biometric authentication data
This argument is a straw man, as from a security standpoint biometric data is always public. It would be completely infeasible to prevent anyone from taking a picture of your face or finger and forming a biometric model of you. Scientus (talk) 19:05, 2 May 2009 (UTC)[reply]

Repeat Section


The Digital Rights Management material is almost exactly repeated under possible applications as well as criticisms. I'm not sure which copy should be removed/changed, but I think we don't need the same information twice. Luminite2 (talk) 19:35, 5 May 2009 (UTC)[reply]

Protection from viruses and spyware section


This section has some problematic claims:

1) "However, Microsoft has denied that any such functionality (virus protection) will be present in its NGSCB architecture." This needs a reference. Even then, MS's failure to include virus protection in NGSCB may have little bearing on whether TC can indeed be used for this purpose. Without further information, it's hard to know.

2) "In practice any operating system which aims to be backwards compatible with existing software will not be any safer from malicious code."[1] I can't find anything in Anderson's document that supports this. (Maybe I missed it?) I also find it dubious: closing ANY attack vector makes a system safer. It may not be "safe enough", but it is safer. If TC can be used to close any attack vector (can it?) then it can be used to make a system safer.

If we remove these claims, there's not much left for this section. I'd then be inclined to delete the whole thing unless we can find some more material for it. I'd like to see a description of how TC can be used to prevent malware infections.

Leotohill (talk) 01:52, 27 May 2009 (UTC)[reply]

Sealed Storage - platform sensitivity


This section contains speculative claims. The sensitivity to machine configuration depends upon the characteristics of the software that is implementing restrictions. TPM provides multiple registers that record different aspects of the configuration. A process that uses the TPM to provide security can choose which of these it cares to be sensitive to. Since there is no current DRM scheme that uses TPM, any claims about its sensitivity to machine configuration, beyond the TPM root key, are speculative. Leotohill (talk) 00:16, 30 May 2009 (UTC)[reply]
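To make the point above concrete: sealing binds data to a caller-chosen subset of PCRs, so sensitivity to machine configuration is a policy decision by the sealing software, not a fixed property of the TPM. The following is a toy sketch only (a real TPM seals under the Storage Root Key with authenticated sessions; here a digest over the selected registers simply serves as a keystream, and the secret must be at most 32 bytes):

```python
import hashlib

def selected_digest(pcrs: dict, selection) -> bytes:
    # Only the registers named in `selection` influence the key; the
    # sealing software decides which configuration changes it cares about.
    h = hashlib.sha256()
    for idx in sorted(selection):
        h.update(bytes([idx]) + pcrs[idx])
    return h.digest()

def seal(secret: bytes, pcrs: dict, selection):
    stream = selected_digest(pcrs, selection)
    return selection, bytes(a ^ b for a, b in zip(secret, stream))

def unseal(sealed, pcrs: dict):
    selection, ct = sealed
    stream = selected_digest(pcrs, selection)
    return bytes(a ^ b for a, b in zip(ct, stream))

pcrs = {0: b"firmware-hash", 4: b"bootloader-hash", 14: b"app-hash"}
blob = seal(b"disk key", pcrs, selection=[0, 4])

pcrs[14] = b"changed"                      # an unselected register changes...
assert unseal(blob, pcrs) == b"disk key"   # ...and the data still unseals

pcrs[4] = b"evil-bootloader"               # a selected register changes...
assert unseal(blob, pcrs) != b"disk key"   # ...and unsealing fails
```

The last two assertions illustrate why claims about configuration sensitivity are speculative: whether a RAM upgrade or new video card breaks unsealing depends entirely on which registers the software chose to seal against.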

Does anyone care to refute that the current TCG TPM architecture is incapable of being used to enforce machine configuration? As I see it, to implement this on a typical commercial operating system, you would need a chain of trust through the entire privileged access and execution chain until the configuration is verified, which is impossible prima facie given the heterogeneity of systems in use. I think the article would benefit from the addition of material supporting that the current state of technology as it interacts with x86 and modern operating systems cannot support DRM enforcement with the aim of preventing a theoretical digital capture of protected content. VmZH88AZQnCjhT40 (talk) 02:47, 11 February 2011 (UTC)[reply]

Key concepts section is largely incorrect


I don't know where these "key concepts" came from, but some are incorrect.

1) the text stated that Secure I/O is a key concept, but it isn't mentioned in the spec. I've removed it from the article.

2) Nor does the TPM spec mention curtained memory. Nor do any of the other references listed below. A web search will find a number of documents that refer to curtained memory and the TPM, but I haven't found any that provide a definitive reference. These may have come from early design proposals for a TPM that didn't make the final version of the spec.

The TPM spec does describe "shielded location" which "is a place (memory, register, etc.) where data is protected against interference and exposure, independent of its form." However, unlike Intel's Trusted Execution Technology and other general discussion of memory curtaining, a shielded location is not owned by a process, and is not directly accessible by the CPU. In the TPM, the content of a shielded location is only available via the purpose-specific commands that the TPM provides.

The TPM overview document also describes "protected storage" which I read as a special case of "shielded location". Again, it is not memory that is accessible by the CPU.

So I see the term "curtained memory" as incorrect here, and I'm inclined to edit out the references to it, unless someone can provide a better answer.

These are the references I searched for "curtain" with no hits:

http://www.trustedcomputinggroup.org/files/resource_files/AC652DE1-1D09-3519-ADA026A0C05CFAC2/TCG_1_4_Architecture_Overview.pdf
http://www.trustedcomputinggroup.org/resources/tpm_specification_version_12_revision_103_part_1__3
http://www.amazon.com/Practical-Guide-Trusted-Computing/dp/0132398427/ref=pd_bxgy_b_text_b#
http://www.amazon.com/gp/product/0750679603/ref=s9_cart_gw_tr02?pf_rd_m=ATVPDKIKX0DER&pf_rd_s=center-4&pf_rd_r=0F6ZXMXAM6BSTS9QEV0P&pf_rd_t=101&pf_rd_p=470939031&pf_rd_i=507846#
http://www.amazon.com/Trusted-Computing-Platforms-Design-Applications/dp/0387239162/ref=pd_bxgy_b_img_c#reader

I plan to rewrite this section as soon as I feel comfortable with the material - but I encourage others to go ahead.


Leotohill (talk) 15:28, 19 June 2009 (UTC) and further revised Leotohill (talk) 02:12, 28 June 2009 (UTC)[reply]
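The distinction drawn above between curtained memory and a "shielded location" can be illustrated with a toy model: the location's contents are never read out directly, only acted on through purpose-specific commands. This is a sketch only (Python name mangling is a convention, not real hardware isolation, and the command set here is reduced to an extend and a quote):

```python
import hashlib

class ShieldedLocation:
    """Toy model of a TPM 'shielded location': the stored value has no
    direct, CPU-style read path; it is only usable via purpose-specific
    commands. (Name mangling merely mimics the access restriction.)"""

    def __init__(self):
        self.__value = bytes(20)  # reset state, like a PCR at power-on

    def extend(self, measurement: bytes) -> None:
        # Purpose-specific command: fold a measurement into the register.
        self.__value = hashlib.sha1(self.__value + measurement).digest()

    def quote(self, nonce: bytes) -> bytes:
        # Purpose-specific command: report a digest over the contents plus
        # a caller nonce -- the contents themselves never leave the "chip".
        return hashlib.sha256(self.__value + nonce).digest()

loc = ShieldedLocation()
loc.extend(b"kernel")
assert not hasattr(loc, "value")             # no direct read access
assert loc.quote(b"n1") != loc.quote(b"n2")  # fresh nonce, fresh quote
```

Contrast this with curtained memory, where a process does own a protected region and the CPU accesses it directly; that is the difference the section's terminology currently blurs.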

Trusted Computing versus trusted computing


The start of the article defines Trusted Computing as what the Trusted Computing Group does. Given that definition, the sub-section Trust says "In some proposed encryption-decryption chips...", thus surely that text must be out of scope. It is also full of citation needed marks, so I will delete that portion.


Dids (talk) 05:09, 27 October 2009 (UTC)[reply]

The common reader does not fully understand AES and public-key encryption, so in order for the section to be understandable, and not gibberish to 95% of readers, it has to be written the way it was. [12] [13] [14] Scientus (talk) 03:16, 9 March 2010 (UTC)[reply]

Proposed revamp


I volunteer to contribute a significant amount of new text, replacing all existing text up to the section “Known applications” and leaving the rest (including the criticism) untouched. I was recently told of the current page and was struck that it is sketchy or silent on many aspects of trusted computing. I can provide a description of Trusted Computing that has been honed over 11 years of designing and describing the technology. My standard spiel mentions what technology exists (and some of what doesn’t), the real reason why it’s called “Trusted Computing”, the way that Trusted Computing protects data (plus its security properties), the essential difference from secure computing, and what Trusted Computing insiders consider to be the core principles of the technology. I (obviously) believe that Trusted Computing is a “good thing” but every technology has aspects that can be difficult to implement, and lots of technologies can be used in undesirable ways. My spiel therefore also mentions difficulties with the technology, and describes the concerns that have been told to Trusted Computing insiders over the years, plus the status of attempts to resolve those concerns. According to Wikipedia’s help pages, someone in my situation should discuss my intentions here before editing the main page, so here I am. I’m in no way a Wikipedia expert. What’s next? Walklooker (talk) 17:50, 20 January 2010 (UTC)[reply]

I've posted draft text at User:Walklooker/draft for `trusted computing' Walklooker (talk) 11:48, 22 January 2010 (UTC)[reply]

It was unproductive to `undo’ the revamp. The reinstated version confuses, contains factual errors, is sparse and patchy on actual trusted computing, and does not mention all the core properties of trusted computing or distinguish them from optional properties or specific implementations. Further, the reinstated version has a systemic fault, in that the description of trusted computing is dominated by a description of a classical DRM system that uses trusted computing as a system component to restrict distribution of data. This engenders confusion because trusted computing does not specify that platforms can’t distribute data and actually includes mechanisms to assist the distribution of data.

Any DRM description involving trusted computing should be in the DRM subsection of the `Trusted Computing applications’ section, or even on the Wikipedia page describing Digital Rights Management. The trusted computing description should instead describe trusted computing, which is a tool for protecting data, designed on the principle that there is no such thing as universally preferred behavior. Trusted computing provides a form of access control that helps enable the selection of preferred behaviors, and helps enforce a selection if one is made, but doesn’t dictate behavior, or even insist that a selection must be made.

In other words, trusted computing does not fix behavior but does protect data by allowing the association of data with any arbitrary behavior, and includes methods to enable the determination of current behavior. This enables protection of data in a wide spectrum of tasks, such as performing one’s business activities, performing personal activities, viewing one’s financial information, viewing one’s medical information, accessing different services or being admitted to sensitive networks, and even just casual Internet browsing. What needs to be explained is what trusted computing is, how it works, and the state of the art, and that is what the revamp did.

Here are examples of concerns with the current (reinstated) version:

• "The term is taken from the field of trusted systems". Actually trusted computing is called trusted computing because it is a technological implementation of a trust process, not because it must simply be trusted.

• "With Trusted Computing, the computer will consistently behave in expected ways, and those behaviors will be enforced by hardware and software". Actually trusted computing mechanisms enable any selection of expected behavior to be enforced but do not check or constrain the selected behavior. Hopefully a selected behavior is consistent, and the computer will consistently behave in expected ways, but trusted computing can’t change the behavior of software, or compensate for faults in software.

• "Enforcing this behavior is achieved by loading the hardware with a unique encryption key inaccessible to the rest of the system". It's hard to be sure what this means. Trusted computing uses cryptography to help enforce a selected behavior, but encryption keys need not be unique and might be loaded or generated locally on the trusted computing hardware.

• "The main functionality of TC is to ensure that only authorized code runs on a system". This is far too simplistic. Trusted computing has no way to constrain the software that can execute on a computer. Even when trusted computing mechanisms are used, there is nothing in trusted computing to constrain the choice of software or behavior that must be associated with data. Anyone with unrestricted access to plain-text data can associate any software with that data using trusted computing.

• "However, uncooperative operating systems can misuse security features to prevent legitimate data exchange!" This comment could apply to any type of computer. Trusted computing does not encourage this type of behavior.

• The current description describes in multiple places the properties of a classical DRM system that uses trusted computing as a system component. This should be described in the DRM example, and explained as an application of trusted computing, namely that a third party associates third party data with a behavior that restricts the distribution of data. The DRM example should also explain that a third party cannot discover or use the trusted computing mechanisms without permission from the computer’s owner.

• "Trusted computing encompasses six key technology concepts, of which all are required for a fully Trusted system". This confuses a DRM system with trusted computing. Further, trusted computing has three core concepts, one of which is not even mentioned in the reinstated description. In contrast, "secure input and output" is not an essential part of trusted computing and "memory curtaining" is a desirable but not essential component of a trusted computer (which needs isolation mechanisms); it may be present in some implementations and absent in others.

• "every Trusted Platform Module (TPM) is required to sign a random number". The accurate statement would be that every Trusted Platform Module (TPM) is required to be able to sign. Trusted platforms never need to do this unless the computer owner decides to reveal that they have a genuine trusted computer.

• "[EK] makes it impossible for a software TPM emulator, with a self-generated Endorsement Key, to start a secure transaction with a trusted entity". Actually nothing prevents the use of a software TPM with its own EK for a secure transaction with a trusted entity, if the entity trusts the software TPM. The EK in a hardware TPM just makes it impossible for a TPM emulator to pretend to be that hardware TPM.

• "[Sealed Storage] means the data can be read only by the same combination of software and hardware". This confuses the properties of trusted computing with that of a DRM system. Sealed Storage releases data to a particular combination of software and hardware, but that software can propagate the data in any way it chooses, and trusted computing provides mechanisms to help propagate the data. Hence data protected by trusted computing could be read by other software and hardware, or just by one combination of hardware and software.

• "This will prevent people from buying a new computer, or upgrading parts of their current one except after explicit permission of the vendor of the old computer." This confuses the properties of trusted computing with alleged effects of DRM systems. In trusted computing, the only upgrades that require permission from an OEM are upgrades to the trusted computing mechanisms that enforce reliable selection of behavior.

• "Remote attestation allows changes to the user's computer to be detected by authorized parties". This is overly simplistic. Remote attestation allows changes to the user's computer to be reported by authorized parties, such as the computer owner. It is the authorized party (the one who authorizes reporting) that approves the receipt of attestation information by remote parties.

• "The computer can then present this certificate to a remote party to show that its software has not been tampered with." This is inaccurate, because the `certificate' shows only that unaltered software is currently executing, not whether software has been tampered with.

• "Remote attestation is usually combined with public-key encryption so that the information sent can only be read by the programs that presented and requested the attestation, and not by an eavesdropper, such as the computer owner." This is at minimum misleading and arguably incorrect. There is no mechanism in trusted computing to prevent the owner obtaining an attestation value. In fact attestation cannot operate without permission from the computer owner. Encrypted attestation serves only to protect the owner's privacy from eavesdroppers. If the owner does not want privacy, the attestation value need not be encrypted.

• "secure I/O prevents the user from recording it as it is transmitted to the audio subsystem". This confuses the properties of trusted computing with that of a DRM system. There is no mechanism in trusted computing to prevent the user from recording an output.

• "remote attestation protects it from unauthorized software even when it is used on other computers". It's hard to be sure what this means. It might mean that remote attestation can be used to ensure that a trusted computer will respect the data that it is accessing on a remote computer.

• "Remote Attestation use, however, has been discouraged in favour of Direct anonymous attestation". It should be made clear that DAA is a substitute for a Trusted Third Party, not a substitute for remote attestation. An additional complication is that DAA can be used with a Trusted Third Party.

• The current description states in multiple places that a CA generates an AIK. This is incorrect. In fact the TPM generates AIKs and a CA provides certificates for AIKs.

• "These three credentials will in short be referred to as "EK". The EK is a platform specific key that uniquely identifies the platform." These statements are contradictory and unhelpful because, in trusted computing, the EK is a key and is not a certificate or three certificates.

• "The EKpub will uniquely identify the endorser of the platform, model, what kind of software is currently being used on the platform, details of the TPM, and that the platform (PC) complies with the TCG specifications". This is confusing because various certificates, which reference the EKpub, describe the platform properties. Also, these certificates do not identify what kind of software is currently being used on the platform.

• "Allegedly, this will provide the user with anonymity". In fact it will provide pseudonymity unless each AIK is used only once.

• "If the Verifier accepts the DAA supplied it will produce an AIK" Again, it’s the TPM that produces AIKs.

• "If the anonymity of the user as a whole will be increased by the new version is another question". There's mathematical evidence that DAA anonymizes the TPM. All that trusted computing can try to do is not make anonymity any worse than it already is (because of other factors).

• "One can easily question the objectives of the Issuer, since this most commonly will be the platform manufacturer." The issuance of an EK and EK credential (or a DAA credential) by an OEM is a value-added service provided by the OEM to their customers. Without such credentials, it’s difficult for a private customer or small business or organisation to convince others that they have a genuine trusted platform. (A famous company or organisation could produce its own credentials and rely upon its brand to convince others that they have a genuine trusted platform.) There's mathematical evidence that an Issuer can't use DAA to obtain or deduce any linkage between a specific trusted computer and the usage of that computer.

• "Another key question is what kind of information will be supplied to the Issuer in order to obtain the DAA credentials". The answer to this question is that the DAA protocol uses the pubEK (and, by implication, the EK certificate) to create DAA credentials. Walklooker (talk) 10:32, 4 April 2010 (UTC)[reply]
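The AIK correction in the list above (the TPM generates AIKs; a CA merely certifies them) can be sketched as a message flow. This is a toy model, not the real protocol: HMAC stands in for the CA's signature, hashes stand in for keypairs, and the class and method names are invented for illustration:

```python
import hashlib, hmac, secrets

class ToyTPM:
    def __init__(self):
        self._ek_priv = secrets.token_bytes(32)        # never leaves the TPM
        self.ek_pub = hashlib.sha256(self._ek_priv).hexdigest()
        self._aik_privs = {}

    def generate_aik(self) -> str:
        # The TPM, not the CA, creates the AIK keypair.
        priv = secrets.token_bytes(32)
        pub = hashlib.sha256(priv).hexdigest()
        self._aik_privs[pub] = priv
        return pub                    # only the public half is sent out

class ToyPrivacyCA:
    def __init__(self):
        self._signing_key = secrets.token_bytes(32)

    def certify(self, aik_pub: str, ek_pub: str) -> str:
        # The CA's role is limited to vouching for a key it is shown.
        return hmac.new(self._signing_key, (aik_pub + ek_pub).encode(),
                        hashlib.sha256).hexdigest()

tpm, ca = ToyTPM(), ToyPrivacyCA()
aik_pub = tpm.generate_aik()
cert = ca.certify(aik_pub, tpm.ek_pub)
assert aik_pub in tpm._aik_privs  # private half stayed inside the TPM
assert len(cert) == 64            # CA returned a certificate, not a key
```

The point of the sketch is the direction of the flow: key material originates inside the TPM and only public values and certificates cross the boundary, which is the opposite of the article text being criticized.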

Your proposal was technically correct, and it is possible that there are some errors in the current version; however, your version is difficult to read for a user who is not an expert. Your version also does not describe in detail how a TC-compliant platform really works; however, if you start doing some minor edits, and not replacing entire paragraphs, we could improve the current version. Dbiagioli (talk) 20:02, 13 April 2010 (UTC)[reply]

Thank you for making constructive comments. I actually did try simple editing (on a local copy of the Wikipedia page) and suggested the revamp because I ended up with more new text than original text. If we start again, the structural changes would be removal of the confusion with a DRM system (perhaps putting that information into the `DRM application' section), followed by corrections and new information. So there will still eventually be large alterations. Would you be prepared to help with this? Walklooker (talk) 09:23, 20 April 2010 (UTC)[reply]

I'll help, perhaps on the weekends, when I've got enough time. The 'DRM application' section could be a nice idea. Perhaps we could also add a section on the 'trusted software stack'? Dbiagioli (talk) 18:56, 20 April 2010 (UTC)[reply]

Not all new computers have one of these


I bought a new computer in 2012 that did not come with this, so the article is wrong. --209.188.32.201 (talk) 17:17, 12 July 2013 (UTC)[reply]

Hm, where exactly is the article wrong? Of course not all recent PCs come with this capability. — Dsimic (talk | contribs) 08:48, 5 August 2014 (UTC)[reply]

External links content to be put into article prose?


Official sites


Software applications


Criticism


Removing 'Trusted third party' section and linking to that article


The 'Trusted third party' section in this article has had a 'needs cleanup' tag on it since May 2010 because it has no citations and is not written in an encyclopedic tone. I'm replacing this entire section with a link to Trusted third party, because that article appears to cover the topic much more thoroughly and more properly. It would be nice for the 'Trusted third party' section here to have a brief summary of the topic in addition to the link, but I wasn't able to learn enough from that other article to summarize it in a useful way. - Brian Kendig (talk) 17:26, 7 January 2024 (UTC)[reply]

Speculative content is not encyclopedic knowledge


A substantial portion of this article is devoted to speculation about potential applications of computer security.

Speculation about potential application for new technologies isn't knowledge. 99 44/100ths% of it is as ephemeral as a soap bubble and yet less captivating and entertaining.

It's enough to say that the cost of being unable to prevent altered and malicious software from interfering with the intended function of everything from industrial machines to music players is sufficient to inspire innovators to apply every available tool, including trusted computing initiatives.

A final reason not to put speculative applications here is that experience has shown that security concepts simple enough to explain in a few sentences are easily defeated. Application proposals sufficiently complex to be worth knowing about will also be long enough to merit a Wikipedia article of their own.

As an editorial comment on the pro/con optimist/defeatist saga here: computing technologies have been proposing and speculatively promising more than could be delivered in a generation since the 1950s. Like the bubbles on top of a glass of champagne, without them it would be much less, but they don't really add anything. TPM is the most recent in a long line of timely initiatives to increase the difficulty of defeating computer security. It's good and it's helpful, but without being able to prevent physical access to the platform it remains the same problem as communicating over an open channel. The time and cost to intercept and alter can be increased, but never insurmountably. PolychromePlatypus (talk) 22:15, 5 May 2024 (UTC)[reply]

  1. ^ Cite error: The named reference Anderson was invoked but never defined (see the help page).