Technology – Ideas in Motion

Ideas in Motion — thoughts about new technology, future-looking ideas, context-awareness, cloud computing and predictive analytics.

 

Recent Blog Posts on this topic: 

Balancing Life in Writing and Tech

Posted on Jun 16, 2017 in lifehack, tech, technology, Uncategorized, writing | 0 comments

I recently had the opportunity to talk at some length with an engaged group of students at Olympia High School. (I talk regularly to school groups, and if you’d like to book me to speak to yours, you can contact my booking agent here.) We discussed both my high-tech career and my writing, and found ourselves talking about technology, the future, and how one balances career aspirations with creative endeavors. Here are some of the resources I shared with them after my talk, as “footnotes” to our conversation.

1) First, I spoke about removing distractions from my creative work and focusing. Here is relevant information about phones and other technology as intentionally addictive and distracting.

Addiction to Phones
https://www.theatlantic.com/magazine/archive/2016/11/the-binge-breaker/501122/

How Technology Addicts You — from a Google Design Ethicist
https://journal.thriveglobal.com/how-technology-hijacks-peoples-minds-from-a-magician-and-google-s-design-ethicist-56d62ef5edf3

Digital Distraction
http://time.com/3616383/cell-phone-distraction/

2) Second, I spoke about Ira Glass (host of “This American Life”) and his emphasis on the experience of a younger, striving artist.

Ira Glass on Failure as an Artist
http://jamesclear.com/ira-glass-failure

Here’s a great animated video with the same quote:
https://vimeo.com/24715531

3) Third, here is a link to the study on “luck” that I mentioned.

Is Luck Real?
http://www.popsci.com/luck-real

More details on this study:
http://www.richardwiseman.com/resources/The_Luck_Factor.pdf

4) Fourth, I talked a bit about writing about autism. Here’s Temple Grandin’s TED Talk about the autistic experience.

Temple Grandin TED talk
https://www.ted.com/talks/temple_grandin_the_world_needs_all_kinds_of_minds

5) Fifth, here are a number of articles about how liberal arts degrees make you more successful in the new economy:

Liberal Arts Degree Is Tech’s Hottest Ticket (factual data)
https://www.forbes.com/sites/georgeanders/2015/07/29/liberal-arts-degree-tech/

Fortune 500 CEOs with Liberal Arts Degrees (factual data)
https://www.thealexandergroup.com/blog-press-room/2014/major-humanities-ceos/

Why Tech Needs Liberal Arts Degrees (factual data)
https://www.fastcompany.com/3034947/why-top-tech-ceos-want-employees-with-liberal-arts-degrees

The Future of Work (an opinion piece)
https://www.insidehighered.com/views/2010/10/01/greenwald

6) Sixth, here’s a series of articles discussing the coming wave of job losses due to high-tech innovation, and AI in particular.

AI and Unemployment
http://www.economist.com/news/special-report/21700758-will-smarter-machines-cause-mass-unemployment-automation-and-anxiety

Robots and Work
https://www.brookings.edu/wp-content/uploads/2016/06/robotwork.pdf

Jobs Disappearing
https://qz.com/895681/silicon-valley-is-right-our-jobs-are-already-disappearing-due-to-automation/

AI and Jobs
http://www.newsweek.com/2016/12/09/robot-economy-artificial-intelligence-jobs-happy-ending-526467.html

7) Seventh, my current book-in-progress covers AR/VR futuristic scenarios. Some of the references I’ve used include the following:

Discussion of Augmented Reality and Virtual Reality
http://time.com/4761298/augmented-reality/

Another perspective from Facebook:
http://mashable.com/2017/04/19/facebook-f8-augmented-reality-socially-acceptable/#7FCQFQyxEmqc

8) Eighth, I mentioned my work in high tech. Here are some relevant videos and highlights that might be of interest.

Google I/O – Ned Hayes on the show floor doing a drone demo
http://nedhayes.com/drone

Intel Developer Forum, on stage with EVP Doug Fisher
https://www.youtube.com/watch?v=DX9wP7ZhAOY

GigaOM Mobilize
https://www.youtube.com/watch?v=jPVP9k2TvJI

Intel TV Interview
http://bit.ly/2rySlze

Early Mobile Phone Interface Demo (it’s very cheesy, but please note that this was before the iPhone and Android went big — http://bit.ly/2sAEM2z ). I led the UI design and demo team for Paul Allen.

Oh, and here’s some information about that iPhone innovation I mentioned, which Paul Allen’s team invented:
http://nedhayes.com/inventing-the-future-everything-old-is-new-again-iphone-6s-edition/

The following information about Silicon Valley is interesting and pertinent:
http://www.businessinsider.com/silicon-valley-history-technology-industry-animated-timeline-video-2017-5

And here’s a quick Seattle Times article that covered my commute to Silicon Valley:
http://nedhayes.com/trains-planes-and-seattle-supercommuters/

And finally, I mentioned several additional books during the course of our conversation. Here are some of the relevant books.

AMUSING OURSELVES TO DEATH, by Neil Postman
https://www.amazon.com/dp/B0023ZLLH6/

NEUROTRIBES, by Steve Silberman
http://stevesilberman.com/book/neurotribes/

Temple...


Intel’s Next Generation Drone Platform with the Intel® Context Sensing SDK

Posted on Jun 18, 2016 in context awareness, geekdad, Intel, tech, technology | 0 comments

Intel’s Next Generation Drone Platform with the Intel® Context Sensing SDK

The Intel® Context Sensing SDK surfaces the power of the Intel Integrated Sensor Solution and advanced sensor capabilities to power mobile, laptop and wearable apps. What if the Intel Context Sensing SDK could be used to understand a drone’s environment, and even to control a drone? That’s the vision we set out to explore in this demo at Intel® Day Zero at Google I/O 2016. Our demo at the Intel Google I/O Day Zero Pre-Party showed the integration of Intel’s Context Sensing SDK with Intel’s next-generation drone platform for the collection and analysis of telemetry and environment-sensing data. The demo highlighted how data collected by sensors on individual drones can be aggregated and processed for different use cases, including authentication and control, environment understanding and proactive maintenance, as well as telemetry for thousands of devices stored and analyzed in the data center.

The demo includes the following:

- A Context Demo Dashboard that demonstrates collection of sensor data across many drones and devices
- Authentication into the “launch” of a drone via multi-device authentication, sensor state recognition and soft sensor inputs (calendar, time of day, etc.)
- A Drone Dashboard that demonstrates collection of sensor data across many drones and devices
- Drone behavior controlled by pre-set rules engines running both in the cloud and on the drones themselves
- Machine learning algorithms, both on drone devices and in the cloud, that monitor thousands of drones and demonstrate how drones can “learn” from their environment and from the inputs given to them via their sensors

The sensor SDK used for drones can provide similar benefits on your smaller device or laptop 2:1. All the algorithms we use on drone sensors are cross-compatible with Intel’s Context Sensing SDK for wearables, phones and laptop/2:1 devices.

Get more information about the Context Sensing SDK here >>
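To make the "pre-set rules engine" idea above concrete, here is a minimal sketch of a rules engine that maps sensor readings to actions. All names and thresholds here are illustrative assumptions for this post, not the actual Context Sensing SDK API or the real drone platform.

```python
# Hypothetical sketch of a pre-set rules engine for drone telemetry:
# each rule pairs a condition over the latest sensor reading with an action.
# Names, fields, and thresholds are illustrative, not the real SDK API.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

Reading = Dict[str, float]  # e.g. {"altitude_m": 80.0, "battery_pct": 14.0}

@dataclass
class Rule:
    name: str
    condition: Callable[[Reading], bool]
    action: Callable[[Reading], str]

@dataclass
class RulesEngine:
    rules: List[Rule] = field(default_factory=list)

    def evaluate(self, reading: Reading) -> List[str]:
        # Fire every rule whose condition matches the latest reading.
        return [rule.action(reading) for rule in self.rules
                if rule.condition(reading)]

engine = RulesEngine(rules=[
    Rule("low-battery-return",
         lambda r: r["battery_pct"] < 15.0,
         lambda r: "return-to-home"),
    Rule("altitude-ceiling",
         lambda r: r["altitude_m"] > 120.0,
         lambda r: "descend"),
])

actions = engine.evaluate({"altitude_m": 80.0, "battery_pct": 14.0})
print(actions)  # ['return-to-home']
```

The same rule set could run on the drone itself for low-latency reactions and in the cloud against aggregated telemetry, which is the split the demo describes.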


Intel in Shenzhen — Context Sensing SDK

Posted on May 1, 2016 in Intel, tech, technology | 0 comments

Our team has delivered a useful set of context-aware tools that can be seen in several games and in both Android and Windows apps on the show floor at IDF Shenzhen. You can get the “Intel Context Sensing SDK” toolset here: http://contextsensing.intel.com/

Here’s some more information about the Intel Context Sensing SDK. The Context Sensing SDK surfaces the power of Intel’s sensor hub and advanced sensor capabilities to power mobile and Web apps through standard RESTful APIs, dynamic algorithms and state machines, married to Intel’s next-generation Intel Integrated Sensor Solution for Android and for Windows, with all your data aggregated to a world-class managed cloud service.

The Context Sensing SDK is a library, available for both Android and Windows, that helps you easily incorporate services and context-aware capabilities into your mobile and desktop applications. The SDK is flexible, offering several methods to use the services, either independently or in combination. The SDK includes Context APIs, which make it easy to create context-aware applications by taking advantage of many built-in context type providers. In addition, the SDK includes a Rules Engine that enables you to create rules based on context and trigger actions once the conditions are met.

You can download and use the Intel Context Sensing SDK here >>
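The "built-in context type providers" pattern described above can be sketched in a few lines: several providers each contribute one slice of context, and the app merges them into a single snapshot it can act on. This is a conceptual illustration with hypothetical names, not the actual Intel Context Sensing SDK API.

```python
# Illustrative sketch of the context-provider pattern: each provider
# returns one slice of context; the app merges them into one snapshot.
# Provider names and values are hypothetical stand-ins for sensor data.
from typing import Callable, Dict, List

ContextSnapshot = Dict[str, str]

def activity_provider() -> ContextSnapshot:
    return {"activity": "walking"}       # stand-in for a sensor-derived value

def time_of_day_provider() -> ContextSnapshot:
    return {"time_of_day": "morning"}    # stand-in for a clock-derived value

def merge_context(providers: List[Callable[[], ContextSnapshot]]) -> ContextSnapshot:
    # Later providers win on key collisions, like layered configuration.
    snapshot: ContextSnapshot = {}
    for provider in providers:
        snapshot.update(provider())
    return snapshot

context = merge_context([activity_provider, time_of_day_provider])
print(context)  # {'activity': 'walking', 'time_of_day': 'morning'}
```

A rules engine like the one the SDK description mentions would then evaluate its conditions against snapshots of this shape.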


God and the Robots

Posted on Oct 21, 2015 in posthuman, technology, theology | 0 comments

I have in mind writing a nonfiction book called God and the Robots. Here’s a first look at my early book proposal.

——————————————–

In the summer of 2015, two stories were on the front page of the New York Times. The stories did not appear to be related. The first was an announcement that a group of technical luminaries – including Elon Musk and Stephen Hawking – had signed a statement calling for governments to outlaw the building of autonomous killing weapons, or so-called “Terminator” machines.[1] The other was a story about a well-meaning robot called “hitchBOT” being destroyed by vandals while hitchhiking across the United States.[2]

The first story was about how intelligent machines should treat us. The second story was about how we should treat intelligent machines. Both stories turn on an unanswered question: what is our ethical relationship to robots? How should we treat each other?

The proposed nonfiction book God and the Robots provides an answer to this important question, from an author with intimate knowledge of current and future machine intelligence, as well as a deep understanding of theological ethics and America’s long engagement with Judeo-Christian ethics. Intel futurist Ned Hayes is an expert in artificial intelligence; he also happens to be a published theologian who has worked on ethical questions for several decades. He thus has a uniquely informed perspective on ethical interactions with machines. With strong endorsements from leading thinkers across the robotics spectrum, God and the Robots is an important and timely update on a topic that has a broad and actively engaged readership, but whose leading New York Times bestselling books are all five to ten years old. God and the Robots fills this market need with an original and deeply informed new book on robotics, ethics, and AI.

Why Theology & Robots

In 2015, nearly 80% of Americans claim some affinity with Judeo-Christian conceptions of God; a strong majority of U.S. adults say they believe in God, heaven and the soul.[3] What will happen to those baseline beliefs when self-aware robots begin to assert themselves? In fact, what is a robot in terms of theological understanding? Does a robot believe in God, or have a soul?

Today, these questions may not seem to matter. But very soon, when your 90-year-old grandmother is being cared for by an always-on and seemingly highly compassionate near-sentient robot, people will begin to ask these questions every day. If your grandmother begins to believe that the robot prays for her, and she cares for the robot as a “person,” then these questions will have immediate resonance. And what if the robot does actually pray for her? What happens then? What will your family think of that robot, and of that grandmother?

These scenarios may come to life within the next five to ten years. Readers living in this generation will need to engage with robots on a daily basis. And they will need answers to critical questions of machine en-soulment, human differentiation and ethical behavior towards robots. This book sees the coming AI sea-change as a way to extend America’s Judeo-Christian ethic to embrace robotic beings.[4]

Finally, one of the few nonfiction categories still growing dramatically is the “Religious”[5] book category; another category still above water is “Technology.”[6] God and...


Inventing the Future: Everything Old is New Again (iPhone 6S edition)

Posted on Sep 14, 2015 in tech, technology | 0 comments

I’ve been really excited to see new innovations in interaction with phone, mobile device and wearable interfaces lately. Some of these innovations are doubly exciting… because I helped invent them, seven years ago, and these new ways of interacting with data and with devices are only now coming to the mass market.

For example, it was fascinating to hear Walt Mossberg trumpet the praises of the iPhone 6S this week at Apple’s launch event. Here’s what Mossberg said:

“Anyone who thought there was no more fundamental innovation to be wrung out of the smartphone is just wrong. The 10-finger multi-touch interface made mainstream by the iPhone 8 years ago has now taken a leap forward with Apple’s 3D Touch. This lets you view content in apps without opening them, quickly perform common actions, and generally manage your smartphone more smoothly by simply pressing a bit harder than usual on the screen. In brief use so far, I found this both highly useful and delightful — exactly the reactions Apple loves best. I expect to use it many times a day.”

Exactly right. And my stellar team at Vulcan Labs (under the Kiha/ARO product name) invented most of these user experience innovations… six or even seven years ago. You see, innovations like this don’t emerge one time or in one place. They percolate around the edges — and in the labs — of various teams of collaborators for many years before, finally, someone gets the budget, the right partners, and the clout to bring them to market in fully productized form. Here are some examples:

1) The “Flipper” of multiple tabs in Apple iOS Safari
2) “Force Touch” / 3D Touch to expose contextually relevant menus
3) Smart contextual information based on in-line text recognition

Let’s go through the details on these “new” innovations, point by point.

1) The “Flipper” of multiple tabs in Apple iOS Safari

(You might note that I went to Whitworth for my undergraduate degree… guess who got to write the copy for the Flipper?)

Right now, Apple uses this only for pages in the on-device Safari browser. I wouldn’t be at all surprised if they eventually got around to our implementation model, where every recent app on the device can be browsed in this flip-book fashion. Here’s the Apple version, side by side with our original version, as available in press-ready presentations since 2009 or thereabouts.

2) “Force Touch” / 3D Touch to expose contextually relevant menus

Imagine that you could touch LONGER in a certain place on the screen, or have the device recognize that you needed information relevant to that place on your screen or in your text. Well, we did that. In 2009. Happy to see it in broad production on an Apple iPhone 6S in 2015.

3) Smart contextual information based on in-line text recognition

Imagine that your device might know where on your screen or where in your text you were, and provide contextually relevant information as a blossoming “menu” available at a touch. Yes, we did that, in 2009. Happy to see it on the iPhone 6S in 2015.

I’m very happy to have been part of extremely collaborative and innovative teams at Vulcan Labs and Xerox PARC who invented many ideas similar to those...
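The in-line text recognition idea can be illustrated with a tiny sketch: scan the text near the touch point for recognizable entities (a phone number, a date, a link) and surface a matching action for each. This is a conceptual toy with made-up patterns and action labels; it is not the Kiha/ARO implementation or Apple's.

```python
# Minimal sketch of contextual actions from in-line text recognition:
# match known entity patterns in the touched text and suggest actions.
# Patterns and action names are illustrative, not any shipped product's.
import re
from typing import List, Tuple

PATTERNS: List[Tuple[str, str, str]] = [
    ("phone", r"\b\d{3}-\d{3}-\d{4}\b", "Call number"),
    ("date",  r"\b(?:Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec)\s+\d{1,2}\b",
              "Add to calendar"),
    ("url",   r"https?://\S+", "Open link"),
]

def contextual_actions(text: str) -> List[str]:
    # Return one suggested action per entity type found in the text.
    return [action for _, pattern, action in PATTERNS
            if re.search(pattern, text)]

print(contextual_actions("Meet me Jun 16, call 360-555-0100"))
# ['Call number', 'Add to calendar']
```

A real implementation would use proper entity detectors and rank suggestions by the cursor position, but the shape of the idea is the same: recognized text spans become a small contextual menu.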


IDF 2015 (Intel)

Posted on Aug 21, 2015 in context awareness, Intel, tech, technology | 0 comments

Intel created an awesome week at IDF 2015, Aug 18-20. We had a fantastic session with the new Intel® Context Sensing SDK for Windows, newly released this week. Here’s a great short summary video of all the fun at IDF 2015! Coverage includes announcements from this year’s Intel Developer Forum: Intel® RealSense™ technology extends into new platforms and devices; the ‘America’s Greatest Makers’ reality competition is announced, premiering in 2016; 3D XPoint™ comes to storage and memory solutions with Intel® Optane™ Technology; and much more. Visit http://www.intel.com/newsroom/idf for more.


Ned Hayes – Media Reel

Posted on Jul 25, 2015 in Intel, tech, technology | 0 comments

NED HAYES
Market Strategist, Futurist, Author

The media reel includes excerpts from interviews as a core product leader at Intel and Xerox PARC, excerpts from author interviews with BookNote, and a reading at the Rainier Writing Workshop in 2015.


Intel in Shenzhen – Context Sensing SDK

Posted on Apr 3, 2015 in cloud, context awareness, Intel, tech, technology | 0 comments

Intel in Shenzhen – Context Sensing SDK

I’m very proud of what my smart team at Intel has accomplished over the last year, and I’m excited to see our tools demonstrated at the Intel Developer Forum (IDF) Shenzhen 2015 this week in China. Our team has delivered a useful set of context-aware tools that can be seen in several games and Android apps on the show floor at IDF Shenzhen 2015. You can get the “Intel Context Sensing SDK” toolset here: http://contextsensing.intel.com/

Here’s some more information about the Intel Context Sensing SDK. The Context Sensing SDK surfaces the power of Intel’s sensor hub and advanced sensor capabilities to power mobile and Web apps through standard RESTful APIs, dynamic algorithms and state machines, married to Intel’s next-generation Intel Integrated Sensor Solution for Android, with all your data aggregated to a world-class Mashery-managed cloud service.

The Context Sensing SDK is an Android library that helps you easily incorporate services and context-aware capabilities into your Android applications. The SDK is flexible, offering several methods to use the services, either independently or in combination. The SDK includes Context APIs, which make it easy to create context-aware applications by taking advantage of many built-in context type providers. In addition, the SDK includes a Rules Engine that enables you to create rules based on context and trigger actions once the conditions are met.

You can download and use the Intel Context Sensing SDK here >>


Runcible – Monohm at MWC 2015

Posted on Mar 4, 2015 in context awareness, geekdad, Intel, lifehack, software, tech, technology | 0 comments

Runcible – Monohm at MWC 2015

I am so impressed with the craftsmanship and creativity shown by the Monohm team with their new “heirloom device,” the Runcible. Even in this first generation of the smartphone, the device no longer feels like a technological interloper in your life. Instead it feels like a natural part of your daily wardrobe, part of your life, and part of your family experience. This is what a “next-generation wearable device” should feel like: it should be organic and beautiful. The Runcible is a round smartphone that fits readily in your pocket, is crafted with care (both software and hardware), and delivers a personalized experience that does not interrupt your life, but instead accentuates what you really care about in your daily experience. I cannot wait to become an early adopter of this remarkable device. (With thanks to my old colleague George Arriola for being part of the team that developed this breakthrough device!)

Pictures from my hands-on moments with the Runcible at Mobile World Congress 2015.


Mobile World Congress 2015

Posted on Mar 4, 2015 in context awareness, Intel, tech, technology | 0 comments

Mobile World Congress 2015

A few photos from my recent week-long demo adventure supporting Intel’s Context Sensing SDK and Integrated Developer Experience at Mobile World Congress in Barcelona (March 2015).

