“A user’s trust and distrust in information systems (IS) are important components in the interactive relationship between users and their systems. A user has to trust a technology before the technology is adopted and fully used. While there is a rich literature on interpersonal trust, trust in information systems has been under-researched...”

I will not attend the conference, but I found the CFP thought-provoking. Trust is something we usually extend to other people and to ourselves. Is it sensible to talk about trusting a machine? If so, what are the dimensions of trust in an IS? Here are a few top-of-the-head possibilities:
• “Application not responding” (or for Mac users, the dreaded pinwheel). If an application habitually hangs, losing your content and wasting your time, you will “Save” like a paranoiac, or stop using the app altogether.
• Programmers’ work conditions. If you know the software was coded in a Mumbai sweatshop, your expectations of its quality will be low.
• Billing mistakes. Just try to call your cell phone provider or the cable company to dispute a computer-generated bill.
• Machine breaks, and manufacturer won’t make good. “Sorry, Sir/Madam, that defect is not covered under warranty.”
• Corruption in construction, shoddy materials. OK, very good electronics are often packaged in cheap cases with sticky keyboards. But don’t buy a nuclear power station built by a Mafia-owned company that put too much sand (and a few dead bodies) in the concrete.
• Identity theft, malware, eavesdropping. This will be the desperately needed salvation of the US Postal Service: People will soon remember that snailmail rarely carries viruses!
• Privacy policies not adhered to. Facebook. Need I say more?
• Tools cast instead of forged. A high torque on a cheap cast-metal crescent wrench put you in the hospital once. Now you only “trust” a forged wrench of high-quality steel.
• Level of safety features built in. Your “trust” in a Swedish safetymobile exceeds your trust in a Yugo. Especially on a California freeway.
• When humans use e-commerce to screw around with you. The item costs $4.95, but five screens later (and they know that every click builds the shopper’s commitment to purchase) you find that the shipping & handling is $8.50.
• In social networks, when you’re not sure whether it’s the system or another user who’s messed up. Here’s a screen shot from Facebook, unretouched except to redact the person’s name. Is he twins? (Hint: No.) Did he accidentally start two Facebook accounts? Has F’book’s algorithm miscounted how many friends he has?
• Complexity of instructions reduces your trust that you can follow them correctly, and your trust that the manufacturer is capable of making a usable product.
• Content that corrupts the young. Phonographs meant people could dance in private, potentially leading to licentious behavior! TV shows that numb the mind and the morals. More seriously, middle-school bullies on social networks. Or, conversely, machines that bend too far away from politically charged content, viz., the buzz about Apple’s Siri “refusing” to refer inquirers to abortion providers. Oh, and wikis that any wacko can edit.
• Effect of faulty information systems on our trust in the host institution. Yesterday I phoned Chase Bank and entered my account number and Social Security number, twice each, before realizing the phone menu had put me in an infinite loop. Does this enhance my trust in Chase? (Hint: No.) We have to rely on third-party helpers like gethuman.com to deal with IS owners in anything approaching an effective way.
• Generalized angst about being a slave to one’s phone or Twitter feed, or about the impact of technological advance on the global environment.
What would you add to this list?
As you see, almost all the above are really about trust in the people behind the IS, not in the IS itself. (And many of these “trust” items are implicit in Everett Rogers’ classic list of determinants of technology adoption.) It’s possible this is what the conference organizer meant, as the word “system” implies both the machine and the people interacting with it – its creators, users, vendors, and support staff.
Nonetheless, it’s a perversion of language to speak of trust in a machine. Trust means we have confidence both in a person’s intentionality and in his/her capability. Machines don’t (yet) have intentionality. Let’s not murder the language unnecessarily.
When von Neumann machines start getting smarter and designing their own offspring, that’ll be a horse of a different color. Then we might trust or distrust the machines themselves, rather than the people behind them.
Perhaps one of you will blog on an allied topic that’s even more fascinating: Does the machine trust the person? Fingerprint, retinal, vocal, and facial recognition for authentication of the user raise a host of issues. Not the least of them is whether you’re willing to let a machine (that you don’t trust) ask you for a DNA swab!