Big Data enables context-aware Security


Enterprises are increasingly required to open and extend their network boundaries to suppliers, partners and customers to support innovative value chains and information collaboration. This trend, along with more corporate applications being accessed through the cloud and on mobile devices, leaves firms vulnerable to sophisticated security threats. Big Data Analytics (BDA) applied to enterprise security promises to bring a new level of intelligence to network forensics and risk management: information security becomes intelligence-driven, contextual and risk-aware in real time. Collecting the data is the easy part; gaining insight into what big data is telling us about security threats is the hard part.

BDA frameworks, along with falling infrastructure costs for data warehouses, make it possible to run massive compute clusters efficiently and with fewer people. These economics will disrupt traditional monitoring, SIEM (Security Information & Event Management), identity management and governance, risk & compliance (GRC). Contemporary SIEM appliances aggregate and correlate roughly thousands of events per second; a BDA-enabled security management platform should be able to process millions of events per second on the same hardware footprint. Historically, you had to filter, factor and reduce security data down to a size that let security professionals analyze it and make decisions. Now, mining petabytes of operational and security risk data from diverse sources can provide actionable intelligence in real time, and much of that mining can be done with industry-standard third-party applications and open-source tools.
BDA enables highly efficient batch processing to analyze historical data: find out when an attack started, how initial probing went undetected and how the attacker breached your systems. Big data analytics in an enterprise security context provides situational awareness, automates threat detection, improves reaction times and will ultimately help with prevention. Watch for startups innovating in this space.
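The leap from thousands to millions of events per second is, at its core, a windowed stream-aggregation problem. As a minimal sketch of the idea (the event schema, window size and alert threshold here are invented for illustration, not taken from any particular SIEM product), correlating events per source over a sliding time window might look like this:

```python
from collections import defaultdict, deque

class SlidingWindowCorrelator:
    """Counts events per source IP over a sliding time window and
    flags sources whose event rate crosses a threshold."""

    def __init__(self, window_seconds, threshold):
        self.window = window_seconds
        self.threshold = threshold
        self.events = defaultdict(deque)   # source_ip -> timestamps in window

    def ingest(self, source_ip, timestamp):
        """Record one event; return True if this source now exceeds the threshold."""
        q = self.events[source_ip]
        q.append(timestamp)
        # Evict timestamps that have fallen out of the correlation window.
        while q and timestamp - q[0] > self.window:
            q.popleft()
        return len(q) >= self.threshold

# Illustrative use: flag a source that logs 3+ events inside 60 seconds.
c = SlidingWindowCorrelator(window_seconds=60, threshold=3)
```

Scaling this pattern from one process to a cluster (partitioning sources across nodes) is exactly the kind of job the BDA frameworks discussed above are built for.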


I've been studying crowdfunding and how the new federal JOBS Act will attempt to give non-accredited investors access to seed rounds in early-stage startups. Once limited to artistic endeavors, charity and filmmaking, the concept has grown from the likes of Kickstarter and Indiegogo into equity investment circles. By some counts there are nearly 1,000 crowdfunding sites in existence, but until the SEC enacts Title III of the JOBS Act, we won't see the new equity crowdfunding portals provided for by the law. One capital-intensive area, biotechnology, won't see this type of funding replace traditional venture capital anytime soon. According to Scott Jordan of HealthiosExchange, the average successful biotech company raises $49 million over 5.7 years through a series of private equity rounds. I agree with his assertion that crowdfunding would help these firms achieve milestones during the seed stage that ultimately get VCs interested. There are already sites connecting a wider range of accredited angel investors and allowing them to syndicate with each other, thereby taking more positions in a portfolio of biotech startups. Diversification and "failing fast" are tremendously important in life-sciences development and research.

Big Data Museums need Human Curators

Most analysts define "Big Data" subjectively: information datasets whose size is beyond the ability of mature software tools to capture, store, manage and analyze. As people and businesses go about their lives, they generate a huge data exhaust as a by-product of social media, smartphones, computing and embedded devices. Since it is very hard for machines to pull operational insights out of big data, there is a rising need for data scientists, often referred to as data "curators." Much like a museum curator collects, catalogues, interprets and preserves artwork or historic items, a data curator works to improve the quality of data-driven information within operational processes. This also involves active lifecycle management that attempts to connect the sciences, social sciences and humanities. Even though programs can poll APIs for AWS or GitHub and pull out somewhat structured data, that data cannot be fully interpreted without human intervention. This is good news, because with our newfound tools the study of the social sciences will transform into digital humanities, where insightful connections are made to economics, law, medicine, education and communication.
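To make the curation point concrete, here is a minimal sketch of the machine half of the job (the record fields and triage rules are invented for illustration): a program can normalize semi-structured records pulled from an API and flag the ambiguous ones, but a human curator still has to resolve what the flagged rows actually mean.

```python
import json

# Semi-structured records, as might come back from polling a public API.
raw = """[
  {"repo": "analytics-core", "language": "Java", "stars": 120},
  {"repo": "etl-scripts", "language": "", "stars": null},
  {"repo": "ml-notebooks", "language": "Python", "stars": 45}
]"""

def triage(records):
    """Split records into machine-usable rows and rows needing a human curator."""
    clean, needs_review = [], []
    for rec in records:
        if rec.get("language") and rec.get("stars") is not None:
            clean.append(rec)
        else:
            needs_review.append(rec)   # missing/ambiguous fields: human judgment
    return clean, needs_review

clean, review = triage(json.loads(raw))
```

The machine does the cheap, repetitive sorting; deciding why a field is empty, and whether the record is still meaningful, is the curator's work.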


A Spring Awakening ... for Java EE


Much has been written about the severe Amazon EC2 outage last week. It made me think about the tools needed for deploying high-availability applications in a cloud environment. Java Enterprise Edition is complex to use but remains the popular choice among enterprise application developers today, and it has a huge installed base needing some form of cloud readiness. Application platform frameworks like Spring provide the runtime middleware container for both custom and packaged applications running on a Platform-as-a-Service (PaaS) cloud. Features of programming languages like Java, C#, C++, Ruby or Python can be extended at runtime by APIs, embedded declarative clauses or metadata patterns provided by a framework like Spring. Optimal allocation of system resources (memory, threads, connection pools, etc.), quality of service (reliability, availability, etc.) and connectivity (messaging, networks and databases) are managed on behalf of the application by the framework. With the huge investment in Java code today, many firms are adopting Spring's Model-View-Controller (MVC) for web applications, its plug-ins for the Eclipse IDE and the many web service add-ons available. The beauty of the framework, and its relevance to cloud, is that Spring separates all business logic from application infrastructure, logical or physical. Now we can have a real application bus where you combine virtualized or non-virtualized applications with structured or unstructured data. Those applications will be exposed to both managed and unmanaged mobile devices (but that's another blog post!). We would build applications by taking blocks (objects) of code using Spring and populate the business logic using open-source lifecycle tools that exist outside of the cloud.
Dynamic languages like Ruby can power large-scale web front-ends with Java EE under the hood, extending the life of existing legacy applications relatively painlessly. With the framework in place, cloud advances in multi-tenant governance, horizontal scaling or cloud transaction processing can take place without major application reconstruction.
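The separation Spring enforces is inversion of control: business logic depends only on abstract interfaces, and the framework's container wires in concrete infrastructure (a datasource, a message queue) at runtime. The idea itself is language-agnostic; here is a minimal sketch in Python rather than Java, with class and method names invented purely for illustration:

```python
class OrderRepository:
    """Abstract infrastructure dependency (what Spring would wire as a bean)."""
    def save(self, order):
        raise NotImplementedError

class InMemoryOrderRepository(OrderRepository):
    """One concrete infrastructure choice; a database-backed one could be
    swapped in by the container without touching business logic."""
    def __init__(self):
        self.orders = []
    def save(self, order):
        self.orders.append(order)
        return len(self.orders)       # order id

class OrderService:
    """Pure business logic: it never constructs its own infrastructure."""
    def __init__(self, repository):
        self.repository = repository  # injected by the 'container'
    def place_order(self, item, quantity):
        if quantity <= 0:
            raise ValueError("quantity must be positive")
        return self.repository.save({"item": item, "quantity": quantity})

# The container's role (Spring's job): wire infrastructure to business logic.
service = OrderService(InMemoryOrderRepository())
```

Because the service never names a concrete repository, redeploying it against cloud infrastructure is a wiring change, not an application rewrite - which is exactly the cloud-readiness argument above.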


It is no secret that the iPad is gaining traction in the corporate enterprise with its ideal form factor, weight and battery life. What Nicholas Carr didn't foresee in his iconoclastic report "IT Doesn't Matter" was how the tightening of the technology stack would be enabled by consumer endpoints capable of running any application from a data center or the cloud. I was interested to see this week's VMware View 4.6 for the iPad join the category of apps like Citrix Receiver and the new Rackspace OpenStack admin app. VMware had to program custom gestures to blend the iOS experience with what most users run on VDI - Windows. Now you can use two fingers to right-click, along with Apple-like drag-and-drop, to interact with Windows. Heck, you can even run Flash on the iPad. Unlike Microsoft's RDP, VMware uses PCoIP, which transmits only changing pixels to stateless endpoints over the network. Since the protocol can tunnel over HTTPS, proxies and firewalls don't block it, enabling virtual end-to-end access without knowing each node's physical path. The iPad used as a VDI device actually helps Microsoft defend its presence in the enterprise, since a satisfying access device may reduce complaints about enterprise application user experience. "The sun shines, and people forget; the spray flies as the speedboat glides and people forget..." - people forget how many Windows workstations are still out there.
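The "only changing pixels" idea behind a protocol like PCoIP can be sketched as a simple frame diff (this illustrates the concept only - the real protocol's encoding is far more sophisticated): the sender compares each new frame to the previous one and ships just the deltas, and the stateless client applies them.

```python
def frame_delta(prev, curr):
    """Return only the (row, col, value) triples that changed between frames."""
    return [(r, c, curr[r][c])
            for r in range(len(curr))
            for c in range(len(curr[r]))
            if prev[r][c] != curr[r][c]]

def apply_delta(frame, delta):
    """Reconstruct the new frame on the client from the old frame + deltas."""
    out = [row[:] for row in frame]
    for r, c, v in delta:
        out[r][c] = v
    return out

# A mostly static screen: only one "pixel" crosses the wire.
prev = [[0, 0], [0, 0]]
curr = [[0, 9], [0, 0]]
delta = frame_delta(prev, curr)
```

On a typical desktop session, where most of the screen is static most of the time, this is why the bandwidth cost is proportional to activity rather than to resolution.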

Near Field coming Near You


Even though Near Field Communication (NFC) has been around for 15 years, it could become mainstream in the U.S. smartphone market this year. NFC operates at 13.56 MHz, at speeds from 106 kbit/s to 848 kbit/s, all within a range of about 4 cm. We are finally catching up with Japan (e.g., the Osaifu-Keitai system) and other areas of the world where NFC is used for mobile commerce and payments. With better software integration, you now have the intersection of context, proximity and event handlers that blend the physical and virtual worlds. It would make sense if Google announced a mobile payment platform, since NFC is natively supported in Android 2.3. You also have to consider players with a little more "trust" than Google, such as Apple iTunes or even PayPal. Merchant players like First Data or GPN are reluctant to adopt an offering that is not an industry standard. MasterCard and Visa have made progress raising consumer awareness of NFC, but financial institutions are not good catalysts for ecosystems. Even though NFC silicon can be standardized, individual competitors bringing their own implementations of payment systems can stall adoption and create payment silos. The battle will be over which model prevails - operator-centric, bank-centric, collaboration-centric or peer-to-peer. Perhaps it doesn't matter, since once users have selected a smartphone platform, they automatically get its mobile payment system. Otherwise we would need something like "payment roaming," similar to what evolved during the early expansion of cellular networks and billing systems.

Why two might still be better than one


I've been following some recent discussion about mobile virtualization. One article by Alex Williams at ReadWriteWeb caught my attention. I agree that many people carry two smartphones today, one for business and one for personal use. It's true that mobile processors lack virtualization support at the hardware level and that manufacturers would have to pre-load this type of functionality. I don't agree, however, that virtualization will drive more downloads of apps onto a single device with dual partitions. It has more to do with the change in application frameworks than with optimizing bare-metal hypervisors on mobile devices. You wouldn't be able to run Android and iOS 4 on the same device anyway, though you might want RIM BlackBerry for business and Windows Phone 7 or webOS for personal apps. Developers are getting tired of the multi-platform treadmill of keeping various versions of mobile apps up to date. Instead, you create a "rich" web app using the latest HTML5 standards first, then wrap native code around it for the downloadable version. Advances in local cache storage will alleviate the bandwidth demands too. This way your users get the same look and feel and predictable UI behavior whether the app is downloaded from an app store or running in the mobile browser.

Only Half of your Heart gets H.264 or WebM


The HTML5 video wars have settled into two camps. Microsoft and Apple support H.264, a video codec recently freed by the MPEG LA, but only for video that is free for end users to view ("Internet Broadcast AVC Video"). Google open-sourced its VP8 video in May under the WebM open web media project with a BSD-style, royalty-free license. Mozilla and Opera support WebM. I think H.264 is a short-term solution for Apple, given that the MPEG LA can change the fee structure in 2016. In the meantime, a pass-through fee could be imposed for protected video content running over iTV. Apple needs to move quickly on the follow-on to H.264: HEVC (High Efficiency Video Coding, aka H.265). HEVC aims to cut bitrate requirements in half at the cost of increased computational complexity. Targeted at next-generation HDTV systems with progressive-scan frame rates and scalable from QVGA to 1080p, it is intended to fully replace H.264. Apple also needs to make sure its processors can handle the future compression (3x or more) while preserving battery life and limiting device heat. Video standards can't be "free to roam or make a home out of everywhere they've been" - it's too costly for content creators to publish to conflicting standards.
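The "half the bitrate" design target is easy to put in concrete terms. Assuming an illustrative 8 Mbit/s H.264 stream for 1080p (actual rates vary widely with content and encoder settings), the arithmetic looks like this:

```python
H264_MBPS = 8.0               # illustrative 1080p H.264 bitrate (assumption)
HEVC_MBPS = H264_MBPS / 2     # HEVC's design target: same quality, half the bits

def gigabytes_per_hour(mbps):
    """Convert a megabit-per-second stream rate to gigabytes per hour.
    3600 seconds/hour, 8 bits/byte, 1000 MB/GB."""
    return mbps * 3600 / 8 / 1000

h264_gb = gigabytes_per_hour(H264_MBPS)   # 3.6 GB per hour of video
hevc_gb = gigabytes_per_hour(HEVC_MBPS)   # 1.8 GB per hour of video
```

Halving every subscriber's per-hour transfer is why the codec transition matters to everyone from mobile carriers to set-top vendors, not just to Apple.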

Video Market Rubric is a Rubik's Cube Puzzle


A rubric is a scoring tool for subjective assessments. Tech-savvy consumers have a natural ability for assigning rubrics when purchasing entertainment or media. The trade-offs in accessibility, content quality and cost have become problematic in the $69.8 billion U.S. TV subscription market. Apple is expected to introduce the new iTV service platform during its event at the Yerba Buena Center in San Francisco this week. Internet video did to cable TV what cable did to network TV: the industry has accelerated the unbundling of content from its transport, just as happened in music, newspaper publishing and book publishing. While the Comcast/NBC merger continues to come under FCC scrutiny, Apple is negotiating with ABC, CBS and Fox for 99-cent streaming TV show rentals. Google is also negotiating with Hollywood for pay-per-view service on YouTube. On the fringe, you have Xbox and Hulu: Microsoft recently increased its Xbox subscription rate (and why not?), while Hulu is trying to sell a $9.99 monthly subscription (with advertising) to its users. The consumer can assess their needs layer by layer, just like solving a Rubik's Cube. Once you decide on the content, you can decide how best to consume it. Leveraging existing in-home appliances like a Blu-ray player or Xbox provides the codec, streaming processor and local cache needed for HD 1080p. People will not want to throw away their existing investment, despite what Apple or Google may do.

I've come to the conclusion that RIM suffers from legacy compatibility, much as Microsoft has. Morgan Stanley analyst Ehud Gelblum forecasts RIM's global market share to decline from 16% to 13% by 2012. Now that RIM is expected to launch the BlackPad with QNX rather than BlackBerry 6, they are accepting the limitations of Java ME and want to drop the baggage of legacy code for older BlackBerrys. QNX is a real-time POSIX OS that is popular in automobile, industrial and medical applications; since it powers BMW navigation systems and Porsche 911 "acoustic processing," I'd like to see what RIM does with it. The Google Chrome OS tablet from HTC and Verizon is expected to come out the day after Thanksgiving - fine, but let's hope it's a better launch than the Nexus One. CIO Magazine called the Dell Aero "an embarrassment to Android"; the Aero runs Android 1.5, so I'd have to agree. We'll have to wait until early next year to see HP's Hurricane based on the webOS from Palm. In the meantime, keep a lookout for the Samsung P1000 Galaxy Tab, which features a 1 GHz ARM processor with Android Froyo, front and rear cameras and Adobe Flash. Many Apple competitors hope to garner market share now that the iPad is firmly established as the category leader. I like that enterprise applications on the iPad are emerging, such as those from Bausch & Lomb and Mercedes-Benz Financial. If Apple wanted to scorch the earth with tablet wannabes, it could just introduce a cost-reduced 7-inch iPad "Nano" at a $400 price point. I'd keep that club in the bag for now.

About Paul Lopez

Paul Lopez is a 20+ year technology veteran whose career has spanned multiple disciplines such as product management, software development, engineering, marketing, business development and operations.

