When you were a kid on the playground, remember how you hated to hear “tag, you’re it!”? Well, now those words have taken on a whole new meaning.
Years ago, pet owners began microchipping — or tagging — their dogs and cats to ensure they could be identified and returned if lost or stolen. About the size of a grain of rice, a pet chip can remain functional for up to 20 years. Well, if it’s good enough for Fido….
Earlier this year, in what seems to be the first large-scale workplace experiment, more than 150 workers at Swedish startup hub Epicenter agreed to have RFID (Radio Frequency Identification) microchips implanted in their hands. Made by Biohax Sweden, the chips use what’s called Near Field Communication (NFC) technology — the same technology used by contactless credit cards and mobile payment applications like Apple Pay — to collect data that other devices can read. Like a swipe card, the embedded chip transmits stored information when waved over an RFID reader.
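That read-and-compare step can be pictured with a toy sketch (this is purely illustrative, not Biohax’s actual firmware or protocol; the tag UIDs, door names, and function are all invented for the example):

```python
# Toy model of an NFC access reader: when a chip is waved over the reader,
# the reader receives the tag's unique ID and checks it against an
# authorization list -- the same basic flow as a company swipe card.

# Hypothetical tag UIDs mapped to the single door each is cleared for.
AUTHORIZED_TAGS = {
    "04:A2:2B:9C:11:80:01": "front door",
    "04:7F:3E:0D:22:94:02": "server room",
}

def handle_tag_scan(tag_uid: str, door: str) -> bool:
    """Grant access only if this tag UID is authorized for the given door."""
    return AUTHORIZED_TAGS.get(tag_uid) == door

print(handle_tag_scan("04:A2:2B:9C:11:80:01", "front door"))   # True
print(handle_tag_scan("04:A2:2B:9C:11:80:01", "server room"))  # False
```

The point of the sketch is that the chip itself is passive: it only hands over a stored identifier, and everything interesting (authorization, logging, tracking) happens on the reader’s side.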
This summer, partnering with Biohax, Wisconsin’s Three Square Market (32M) became the first U.S. employer to offer microchipping to its employees. All it takes is inserting a $300 chip underneath the skin between your thumb and forefinger. Then, like magic, you can open security doors, log into computers, buy snacks and so much more with a mere wave of the hand. Just imagine — no fumbling for keys, searching for your company badge, or trying to find that one credit card that offers extra rewards. Once tagged, you no longer need them. Talk about convenience! When Cole Porter wrote the classic standard “I’ve Got You Under My Skin” in 1936 — and urged us to “use your mentality, wake up to reality” — I’m sure he didn’t have this in mind! But that was then, and this is now.
The technology has been around, and Biohax has been selling the chips, for several years. So why haven’t more employers and employees jumped on the bandwagon? Data-security and privacy concerns, for users and third parties alike, of the sort that initially surrounded wearable technology probably dictated caution. Like Big Brother, an embedded chip can track a worker’s whereabouts, movements, and how much time is spent in places like the employee break room — much the same data that a company swipe card or smartphone can provide. But an employee can’t control the chip’s information collection or flow, and chips may be fertile ground for hackers. There’s no on-off switch, so business and personal data are at risk 24/7. And any reasonable expectation of privacy goes out the window.
Wearables — from smart watches to Fitbits — are now so mainstream that employers are using them in employee wellness programs. Could implantable technology be the next big thing? When employers take a hard look at the ethical, legal and practical challenges that employee chip implantation may bring, will they choose to go down that road? And what are the rules of that road? While there’s currently no controlling federal law, some states have already staked out their positions and begun to regulate chip usage. Nearly a decade ago, California barred the mandatory implantation of RFID or other subcutaneous identification devices in humans, expressly prohibiting conditioning employment, promotion, or other employment benefits on an applicant’s or employee’s consent to it. Missouri, North Dakota, Oklahoma, and Wisconsin laws similarly bar employers from requiring employee implants. And a lawmaker in Nevada is pushing for it to become the fifth state to ban microchipping people without their consent.
Nearly 40 years ago, Lee Majors starred as a bionic man — Col. Steve Austin on “The Six Million Dollar Man.” It was pure science fiction then, but fast forward to today — artificial hips, titanium knees, and microchipped employees are no longer a pipe dream. To paraphrase the Rolling Stones, “the change has come, [it’s] under my thumb.” Even so, I’d have to think long and hard before going down that road. For now, I’ll hang onto my car keys and credit cards, and try to remember my latest passwords.
About the Author: Tami Simon