<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	>
<channel>
	<title>Comments on: Face of the Future!</title>
	<atom:link href="https://www.humintell.com/2011/07/face-of-the-future/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.humintell.com/2011/07/face-of-the-future/</link>
	<description>See what you&#039;ve been missing</description>
	<lastBuildDate>Mon, 19 Sep 2011 01:06:35 +0000</lastBuildDate>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.7.2</generator>
	<item>
		<title>By: Keith D.</title>
		<link>https://www.humintell.com/2011/07/face-of-the-future/comment-page-1/#comment-12083</link>

		<dc:creator><![CDATA[Keith D.]]></dc:creator>
		<pubDate>Thu, 21 Jul 2011 07:20:38 +0000</pubDate>
		<guid isPermaLink="false">http://www.humintell.com/?p=7712#comment-12083</guid>

					<description><![CDATA[As much as I generally abhor government regulations, it may be (past) time for the government to step in and create some boundaries as to how this kind of data mining can be used and who should have access to it. One might even argue that a person&#039;s emotional state (as detected by a remote computer system as part of a service provided by a company etc.) is a part of that person&#039;s mental health, and as such its use is already protected by HIPAA. That would be an interesting tack for some privacy-loving civil libertarian to take on it. It might be wise for large companies to step very cautiously into this new technology while people hammer out how it should be used safely.

What does everyone else think about this? Is your emotional state (when not explicitly shared on a per-instance basis) private? And more pertinently, are automatically detected measures of your emotional state akin to surreptitiously gaining access to your mental health records? Should it be illegal for businesses to mine, gather, index, correlate, sell, or use etc. that information for their own purposes?]]></description>
			<content:encoded><![CDATA[<p>As much as I generally abhor government regulations, it may be (past) time for the government to step in and create some boundaries as to how this kind of data mining can be used and who should have access to it. One might even argue that a person&#8217;s emotional state (as detected by a remote computer system as part of a service provided by a company etc.) is a part of that person&#8217;s mental health, and as such its use is already protected by HIPAA. That would be an interesting tack for some privacy-loving civil libertarian to take on it. It might be wise for large companies to step very cautiously into this new technology while people hammer out how it should be used safely.</p>
<p>What does everyone else think about this? Is your emotional state (when not explicitly shared on a per-instance basis) private? And more pertinently, are automatically detected measures of your emotional state akin to surreptitiously gaining access to your mental health records? Should it be illegal for businesses to mine, gather, index, correlate, sell, or use etc. that information for their own purposes?</p>
]]></content:encoded>
	</item>
	</channel>
</rss>