Wednesday 1 April 2015

mobile technologies in 2015

Take a step back from your iPad, iPhone, Galaxy, or whatever for a moment. What you hold in your hand today should undergo serious improvements in 2015, given the groundwork laid in 2014. For some people, taking advantage of those improvements will mean getting new devices, but many current device owners -- especially those who bought Apple's latest models -- will access them in what they already own.
1. 64-bit apps
iOS 7 debuted with the 64-bit Apple A7 processor in the iPhone 5s, iPad Air, and iPad Mini with Retina display. Apple's Xcode 5 IDE allows creation of 64-bit apps from existing code, so the iOS world will see 64-bit apps become common in 2014. As with the transition to 64-bit apps in Mac OS X Snow Leopard, most apps won't really take advantage of the greater processing and memory capabilities in their first 64-bit versions, both because developers won't have figured out how to get the maximum effect in the first go-round and because they won't want the 32-bit versions of their apps used on older devices to be radically inferior until enough of the market has 64-bit devices.
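To see what a 64-bit port actually has to watch for, here's a minimal sketch (in Swift, purely for illustration) of the classic pitfall: the default integer type changes width with the device, so code that cares about exact size should use explicitly sized types.

// Minimal sketch: why "recompile for 64-bit" isn't always enough.
// On a 32-bit device Int is 4 bytes; on an A7-class device it's 8.
import Foundation

print("Int is \(MemoryLayout<Int>.size) bytes on this device")  // 4 or 8

// Risky: the range silently changes between 32- and 64-bit builds.
let counter: Int = 0

// Safer when the width matters (file formats, network protocols):
let fileOffset: Int64 = 0
let packetLength: UInt32 = 0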
After Apple debuted the A7 in September, several Android smartphone makers said they too would ship 64-bit devices, likely using a recent ARM reference design. But that won't do much for them until Google has a 64-bit version of Android to run on it. Expect that in the second half of 2014, giving iOS nearly a year's lead time in 64-bitness.
2. Spatial sensitivity
Motorola Mobility debuted its X8 motion coprocessor in the Moto X this year, and Apple followed up with its own M7 motion coprocessor in the iPhone 5s, iPad Air, and Retina iPad Mini. You won't find these coprocessors in PCs, which tend not to be used in motion or with devices that move. But smartphones and tablets are used on the go and with items in motion, such as fitness monitors and navigation devices.
A motion coprocessor will make it easier for mobile devices to incorporate tracking of their own motion, as well as that of peripherals, into their computing. Using a coprocessor means there's less drag on (and power usage from) the main processor, so apps that use spatial sensitivity derived from motion can run all day -- even when the device is asleep. If you've ever used GPS on your mobile device and watched it burn through your battery in minutes, you know avoiding that drain is critical to making this a capability you'll leave turned on.
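For developers, the coprocessor is reached through ordinary APIs. Here's a minimal Swift sketch using Apple's Core Motion framework to subscribe to the activity classification the M7 collects; on M7-equipped devices this data is gathered on the coprocessor, so reading it doesn't keep the main CPU awake.

import CoreMotion

let activityManager = CMMotionActivityManager()

// Only available on devices with a motion coprocessor.
if CMMotionActivityManager.isActivityAvailable() {
    activityManager.startActivityUpdates(to: OperationQueue.main) { activity in
        guard let activity = activity else { return }
        if activity.walking { print("User is walking") }
        if activity.automotive { print("User is driving") }
    }
}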
As motion processing is built into more devices, apps and peripherals that make imaginative use of it will proliferate -- it's not just for runners and those trying to lose weight. Again, the iOS world will have a good year's lead time on this technology because motion processing is now standard in all new Apple devices, whereas only Motorola and parent company Google have it (so far) in the Android world.
3. Beacons
They're in your neighborhood Apple Store, and they're coming to sports stadia, shopping malls, and perhaps downtowns. These little devices use Bluetooth to communicate with your mobile device and a Wi-Fi or Ethernet connection to connect to the Internet, serving as an information waystation. That may sound like just a Wi-Fi access point, but it's not -- in fact, beacons aren't access points at all.
Instead, they're location-specific points of contact. That means they serve a small area -- Bluetooth's roughly 30-foot range -- to provide custom interaction related to that specific spot. For example, a walking tour, zoo, or museum could use them to know what you're looking at and provide links to relevant details or play audio or video for that tour segment. A stadium could use them to know where you are so that the food you ordered gets to you faster or to tell you the nearest restroom's location. A store's online help or inventory system would know what department you're shopping in.
Beacons don't require interaction, of course -- they can simply record the Bluetooth network addresses of devices that come in range to build a model of foot traffic, where people tend to linger, and so on, all of which would be of great interest to retailers, urban planners, and police. But the interesting applications for individual users will involve websites and apps that interact with beacons to know where you are, then customize content and services accordingly. There's a lot of potential for innovation with beacons, as well as potential for marketing and other privacy abuses.
Apple is the power in beacon technology -- its iBeacons technology is in every iOS 7 device. iBeacons even lets iOS devices act as beacons (all those retailer iPads and iPod Touches now have a new use). But several companies sell stand-alone beacons, as well as beacon protocols and services that can be used in apps across multiple platforms. Some of those also use Apple's iBeacons protocols, of course.
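For a sense of how little code the app side takes, here's a minimal Swift sketch of iBeacon ranging via Apple's Core Location framework. The UUID and identifier below are placeholders for whatever values a real deployment programs into its beacons.

import CoreLocation

class BeaconListener: NSObject, CLLocationManagerDelegate {
    let manager = CLLocationManager()
    // Placeholder UUID; a deployment uses the one set in its beacons.
    let region = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!,
        identifier: "store-department-beacons")

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startRangingBeacons(in: region)
    }

    func locationManager(_ manager: CLLocationManager,
                         didRangeBeacons beacons: [CLBeacon],
                         in region: CLBeaconRegion) {
        // major/minor identify the specific beacon; proximity is a rough
        // near/far estimate derived from signal strength.
        for beacon in beacons {
            print("Beacon \(beacon.major)/\(beacon.minor), proximity \(beacon.proximity.rawValue)")
        }
    }
}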
Because Apple has by far the broadest beacon-capable user base, expect it to be the center of gravity for this technology. Again, expect Google to introduce a similar set of APIs and OS-level hooks in Android at some point.
4. Miracast
In March 2008, Apple reworked its failed Apple TV device to be a stand-alone media streamer for both local (iTunes) content and online (iTunes Store) content. In September 2010, Apple reworked its little-used AirTunes technology as AirPlay, allowing iOS and OS X devices to wirelessly stream video and audio content to the Apple TV and licensed AirPlay speakers. The combination of AirPlay and the Apple TV revolutionized media consumption, letting computers and mobile devices stream content to a variety of playback devices, as well as receive (in the case of iPads and iPhones) media from other devices. The technology has also gained traction in some businesses for conference room presentations.
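On the app side, opting into AirPlay streaming is close to trivial. Here's a minimal Swift sketch using Apple's AVFoundation framework (the URL is a placeholder): with external playback enabled, iOS can send the video itself to an Apple TV rather than merely mirroring the device's screen.

import AVFoundation

// Placeholder stream URL for illustration.
let url = URL(string: "https://example.com/movie.m3u8")!
let player = AVPlayer(url: url)

// Let AirPlay carry the video stream itself to an Apple TV.
player.allowsExternalPlayback = true
player.usesExternalPlaybackWhileExternalScreenIsActive = true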
But in the rest of the technology world, media streaming is a mess. The Android world has three types of physical video connectors in use (MHL, Mini HDMI, and Mobility DisplayPort), as well as two video-streaming technologies (DLNA and Miracast). Windows 8 uses WiDi, Intel's Wireless Display technology built into its current graphics coprocessors. Amazon.com's new Kindle Fire HDX tablets support Miracast.
Miracast promises to change that, though it had a rough start in 2013. Backed by the Wi-Fi Alliance, which rendered the once-messy 802.11 protocols interoperable, Miracast is meant to make wireless video streaming interoperable across computers, mobile devices, and entertainment devices like stereos, TVs, and speakers. Although Intel's WiDi incorporates the Miracast standard, many Windows PCs need driver updates for Miracast support to actually work. The Kindle Fire HDX is certified with only one Miracast device, the Netgear Push2TV -- undermining the interoperability promise of Miracast. So far, only Google's and subsidiary Motorola Mobility's recent Android devices support Miracast.
2014 will be the year that Miracast pulls together and delivers on its promise -- or follows the fate of DLNA, the clunky standard introduced in 2003 that often fails when mixing devices from different manufacturers and thus flopped in the living room.
5. MBaaS
It's one of the ugliest terms in tech today, and its meaning is highly variable and confusing, but mobile back ends as a service (MBaaS) is both increasingly important to developers and, I believe, about to go through a major shift. Forrester Research had a good explanation of MBaaS in 2012, when the term began to proliferate: middleware providing the data management and authentication services that mobile apps need when they're part of a deeper data-driven service. Today, MBaaS is used to mean almost any cloud-resident service an app may need access to, such as video rendering, payment processing, location information lookup, and ad serving.
One sales pitch for today's expansive "any services" version of MBaaS is that mobile devices have too little processing power and storage capacity to do "real" computing, so they need an assist from the cloud. That's not true of the Apple devices and high-end Android devices of the last few years, of course, but it is true that tapping into the cloud provides an almost limitless set of capabilities that developers can use rather than re-create, allowing them to weave together more functional apps.
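In code, the MBaaS pattern is just an authenticated call to someone else's service. Here's a minimal Swift sketch against a hypothetical provider endpoint -- the host, path, and token are made up for illustration; every real provider has its own API.

import Foundation

// Hypothetical MBaaS endpoint; the service handles storage, auth, and sync.
let url = URL(string: "https://api.example-mbaas.com/v1/users/me/highscores")!
var request = URLRequest(url: url)
request.setValue("Bearer <token>", forHTTPHeaderField: "Authorization")

URLSession.shared.dataTask(with: request) { data, response, error in
    guard let data = data, error == nil else { return }
    // The app holds no business logic for this feature; it just renders.
    print(String(data: data, encoding: .utf8) ?? "")
}.resume()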
If you use Chrome OS, the Chrome browser, or Windows 8's Metro side -- or Web apps in general -- you already see that the mashup notion that briefly shone in the late 2000s is alive and well, but without that name. MBaaS is now effectively a services offering for functionality, rather than apps or infrastructure, that app developers will pull together no matter what devices their software runs on.
That's where I believe the MBaaS shift will occur in 2014. The "M" part will go away because the same logic applies to desktop and Web apps, too. The "B" part will also go away, because the notion of a back end is too confining and assumes a central data center model when in fact services (like APIs) will come from multiple sources and be federated. It's really just services, and they will enrich mobile apps even more.
In other words, MBaaS is going to yield to simply cloud-delivered services. That's why Software AG bought former mashup king JackBe, eBay's PayPal unit bought StackMob, Facebook bought Parse, Salesforce.com's Heroku unit partnered with AnyPresence, and Google and Microsoft offer MBaaS functionality in their platform and infrastructure services.