Malware That Eavesdrops Via Computer Microphones Is Stealing Hundreds Of Gigs Of Data


It’s very common to read about malware pilfering all kinds of confidential data from computers — spreadsheets, documents, databases, pictures. Researchers have uncovered a new malware campaign that takes things a step further. It’s recording audio near compromised systems by stealthily switching on the computers’ microphones.

 

CyberX, a security provider that specializes in industrial control systems (ICS) and the Internet of things, recently reported on an advanced threat called Operation BugDrop. The sophisticated malware, they say, has already exfiltrated hundreds of gigabytes worth of data. CyberX has tallied at least 70 organizations victimized by BugDrop so far. They range from a civil engineering firm to a human rights organization to newspapers.

 

So far, the bulk of the attacks have targeted operations in Ukraine. BugDrop has also spread to other countries, including Austria, Russia, and Saudi Arabia. CyberX notes that whoever is behind BugDrop has access to significant resources, given that the malware is siphoning several gigabytes of data every day — and that data must be decrypted before it can be analyzed.

 

When you’re talking about malware and significant resources, you’re often talking about a state-sponsored campaign. CyberX certainly seems to lean that way, saying they are “comfortable assigning nation-state level capabilities to this operation.” They’re careful to add, however, that “attribution is notoriously difficult” and that they “have no forensic evidence that links BugDrop to a specific nation-state or group.”

 

How Is BugDrop Being Spread?

 

It won’t surprise you to learn how a computer becomes infected by BugDrop. Like nearly every other strain of malware you read about today, BugDrop is being spread via phishing emails (just like recent attacks against Gmail and PayPal users). Office documents laced with malicious macros deliver the “dropper,” which injects the actual malware into a victim’s computer.

 

So far, only a handful of anti-malware scanners detect BugDrop. That’s not great news, though security software never has to get involved if users are trained to recognize suspicious emails and resist the temptation to open shady attachments.

 

Article courtesy of Lee Matthews.

How to Jump Ship (And Not Get Eaten By Sharks)


Jason Silver, CEO, CTI Consulting


You collected your annual bonus in January and the check has now cleared your bank.  Your patience in staying with your current employer through the holidays has paid off.  Now, you are definitely ready to bid adieu to your overbearing boss and that annoying guy in the cube next door in search of greener (no pun intended) pastures.

 

But, where do you start?  And, how do you navigate the job search process in such a way that you are able to leave on your terms and for the best opportunity?  Fear not, grasshopper.  Below is your guide to landing your next position while avoiding the most common pitfalls.

 

Don’t Talk

The key to secrecy in your job search is to maintain a need-to-know policy.  Tell your recruiter, your mom, and your priest/rabbi if you must.  If you tell one person inside your company, even your BFF who takes a vow of secrecy, the news will spread like wildfire.  There will be plenty of time to share all of the gory details AFTER you have started the new role.  Case in point: during the writing of this blog, one of our clients found out that a new consultant was interviewing just 10 days into the contract.  His contract was unceremoniously ended on the spot.  The manager argued that she would rather end the contract on her terms than have him leave at a critical point mid-project.

 

The risk:  Your boss finds out you are looking and your slow, methodical search for the perfect position becomes an emergency search for a replacement job after you are escorted out by security like a common criminal.

 

Don’t Post Your Resume

Piggybacking on point one, it is best to keep your job search quiet.  Recruiters scour job boards all day, and all it takes is one who wants to win brownie points with your boss by breaking the news that they came across your resume on the job boards.  Further, your phone will ring off the hook at all hours with pitches from recruiters who have barely read your resume regarding job openings that are not a fit.  If you do feel compelled to post your resume, make sure it is confidential, which removes your name.  And, do not forget to remove your contact information from the RESUME section.  I cannot tell you how many confidential profiles I have seen only to find the person’s name smack dab at the top of the resume.

 

The risk:  Same as the first one with the added bonus of allowing a recruiter to throw you under the bus for their own personal gain.

 

Build Relationships With GOOD Recruiters

A few solid recruiters will know about pretty much every opening in your job market.  Further, they will keep your candidacy private.  If you establish good rapport with the recruiters that have some tenure in the industry, you will find that they will get to know you well as a candidate and will reach out to you proactively from time to time, even when you are not actively looking, to discuss opportunities that may be of interest.  More important than any of this, including the occasional free lunch they may buy you, is they will not waste your time.  A true professional recruiter will know which opportunities and companies may be a fit for you as well as the appropriate salary range.  These relationships can be beneficial throughout your career.

 

The Reward:  Make your search efficient by building relationships with a few of the best recruiters in your market.

 

Counter Offers

In general, it is a bad idea to entertain counter offers.  Sometimes, your employer will throw a truck load of money at you out of sheer desperation.  But, you can assume with confidence that the next move will be to search for your replacement.  Once you’ve put in notice, your employer’s trust in you will be irreversibly broken.  To this point, some studies claim that over 90% of employees that accept a counter offer are no longer with that company within a year.  Not to mention you have accepted an offer by this stage and reneging on this acceptance reflects poorly on you and may lead to burning bridges.  Remember why you are looking in the first place.  If it’s only about money, perhaps start by having that discussion with your current employer before starting your search.

 

The risk:  You accept a counter offer only to be let go when they find your replacement.  You also burn bridges with the recruiter and other company from which you had previously accepted an offer only to reverse course.

 

Conflict of Interest

Do not be surprised if the recruiter who placed you at your current employer reacts sheepishly when you contact them to find your next position.  There is an ethical issue in taking candidates away from companies with which the recruiter works.  Moreover, many contracts between recruiting company and employer have specific language that would make this behavior a breach.

 

The Take Away:  If you were placed at your current employer by a recruiter, have a candid discussion about their ability to help you with your job search.  From an ethics and contract standpoint, they may need to sit this one out.

 

Be Serious

While it’s ok to window shop for new positions, be cognizant of the time dedicated to the interview process.  The recruiter is spending time to find you a new role and the potential employers are spending time interviewing you with the belief that you are a viable candidate.  I have seen some of the best companies NEVER re-interview a candidate that turned down an offer from them previously.  If you are dealing with the right companies and recruiter, they will always respect your time.  Make sure to reciprocate.

 

The Take Away:  Make sure you are truly ready to make a move, as opposed to just kicking tires, before getting too deep into the process, so as not to burn bridges.

 

Two Weeks’ Notice

It is always advisable to put in at least two weeks’ notice before leaving your current employer.  But, if you are in the middle of a key project or if your position may be particularly difficult to backfill, consider asking your current employer what will make the transition easier for them.  As much as a month is not uncommon.  Within reason, your new employer should understand, and any push back from your future company could be a bad sign.

 

Take Away:  Always put in appropriate notice whenever possible.  Sometimes this is longer than two weeks.

 

Resume Formatting

I have seen every kind of resume imaginable.  I could write a book dedicated to just resumes.  In the interest of brevity, I will address one key point here pertaining to resume length and detail.  The days of the one-page resume are gone.  A proper resume in the IT field, for example, assuming that it is not for an entry-level candidate, is three to five pages.  I cannot recall getting complaints about resumes having too much detail, but I certainly have had issues with resumes short on detail.

 

Take Away:  Resumes should be three to five pages in most cases and contain plenty of specific detail on previous roles and tasks.

 

Negotiation

This is one aspect that many candidates are not comfortable with; however, it is one of the most critical components of the interview process.  Many large companies have a fixed salary increase structure, which means that from where you start, it is a slow, methodical climb.  Strong recruiters can help with this, in particular for permanent positions where their interest is directly aligned with yours (i.e., the more you make, the bigger their commission).  If a larger salary is not possible, I have seen more PTO put on the table, an office instead of a cube, a work-from-home day, etc.

 

Take Away:  If you don’t try, you will leave money or additional benefits on the table more often than not.

 

Changing jobs can be stressful, albeit inevitable for most of us.  But, you can manage the stress and have a positive outcome by following some simple rules, implementing a solid strategy, and avoiding some common pitfalls.  By doing so, hopefully you will find the process relatively painless and avoid becoming shark bait.  Happy hunting!

WhatsApp Changes Everything With Its New ‘Status’ Feature


By Parmy Olson at Forbes.com


WhatsApp is making a radical update to its app, turning it for the first time into a platform for passively consuming content, similar to the way people scroll through their Facebook or Instagram newsfeeds – and it’s a move that could finally usher in a money-making system like advertising.

 

WhatsApp’s new Status feature, being rolled out on Monday, will let users share photos, GIFs or videos overlaid with drawings, emojis and a caption that will be visible to selected friends for 24 hours, before disappearing.

 

If this sounds familiar, that’s because it’s exactly like Snapchat’s hugely successful Stories feature, launched three years ago, which lets users share similarly ephemeral timelines.

 

The move probably shouldn’t be surprising. Facebook, which owns WhatsApp, saw its other social media property Instagram roll out a clone of Stories last summer, also called “Stories.” A spokesperson for Snapchat could not be reached for comment at the time of writing.

 

This represents a bigger shift for WhatsApp than it did for Instagram, though, because it potentially heralds a very different way of using the app. Until now, WhatsApp has been a utilitarian hub of activity: people go there simply to type and read messages, and type some more, not to scroll endlessly through streams of other people’s content.

 

Status will change that use case for the first time. It also potentially opens the door to messages from businesses, or rather, advertisers. WhatsApp said more than a year ago that it was looking at ways that businesses could send messages to its users in an unobtrusive and useful way.

 

That has always sounded like a tall order — businesses ultimately want to persuade people, not just inform them — and particularly difficult given the chatting system that’s at the center of WhatsApp itself.

 

Facebook has been able to rake in billions in revenue each quarter from advertisers precisely because it can insert their videos and photos into its content-heavy Newsfeed.

So far, attempts on Facebook Messenger and elsewhere to invite “bots” from advertisers to chat with people have fallen flat – any success there needs smarter artificial intelligence behind it and so is probably some ways off.

 

WhatsApp may have experimented with bots in the hope that it didn’t have to go down the tried-and-tested route of displaying content; co-founders Jan Koum and Brian Acton have been vehemently against advertising on their app since their early days, but monetizing their app in any other way does sound almost impossible.

 

“As a utility, we’re focused on building features that will be used around the world by our 1.2 billion users,” a WhatsApp spokesperson told FORBES. “Over time, we’ve seen a big uptick in users sharing rich content, such as photos, videos and GIFs on WhatsApp. We wanted to offer a simple, secure, and reliable way for people to share this type of content with all their contacts at once.”

 

Although Status is for all intents and purposes a copy of Snapchat’s Stories, the feature actually goes back to the roots of why WhatsApp was built in the first place.

 

In 2009, when Jan Koum started building what would become the most popular messaging app in the world, he started off by building a status app.

 

“Jan was showing me his address book,” Koum’s friend and entrepreneur Alex Fishman told me for a profile on Koum in 2014. “His thinking was it would be really cool to have statuses next to individual names of the people.”

 

The idea was that if you were going to the gym, in a meeting, or had a low battery, you could let people know the situation so they knew not to call you, or at least could know what was going on.

 

Hence the name, WhatsApp, or what’s up.

 

Koum got his friends to download the app and it basically worked, but it wasn’t getting much traction. Then Apple introduced push notifications, meaning that every time someone updated their status, everyone got “pinged.” So Koum’s friends started changing their status updates to things like “I’m on my way.”

 

Suddenly they weren’t just updating their friends, but sending a message. Rather accidentally, one of the most important pivots in Silicon Valley history – right up there with Uber introducing Uber X and blasting a hole in the taxi industry – happened almost overnight, and WhatsApp’s users quickly swelled beyond Koum’s circle of friends in San Jose, to 250,000.

 

Five years later, Facebook bought Koum’s former status-updating app for $19 billion, and the rest, as they say, is history.

 

WhatsApp has always retained the original status update next to each user’s name, along with their profile photo. Today’s Status feature won’t replace that. It will be a new, separate tab with a + sign that takes users straight to the WhatsApp camera. Tap that button and you’ll also see updates from other friends and family – which is where the real behavioral change for WhatsApp users will come in.

 

Friends can reply to the new “status” by tapping the reply button; the reply is sent as a new WhatsApp message.

 

WhatsApp said the feature would roll out to users from Monday 20 Feb., and would be “available soon around the world for iPhone, Android, and Windows Phone users.”

Snowflake’s Vision For The Rebirth Of The Data Warehouse


For too many companies, the data warehouse remains an unfulfilled promise. The work that was started with data warehouses to create a living, clearly defined source of truth about what is happening in a business has never really been finished. Far too few companies have achieved the data nirvana of creating a clearly defined, searchable, and scalable data warehouse. A smaller number still have complete metadata management, comprehensive data governance, and data lineage. When these victors started to address big data, they didn’t toss their data warehouse, but rather learned how to extract signal from big data and added it to this beating heart of value.

In my view, Snowflake, a SQL-based data warehouse built from scratch on the cloud, is founded on the premise that it is easier to create data nirvana by:

 

  • Implementing a cloud-native data warehouse that brings new levels of flexibility to adapt to workloads, along with self-service and simplified administration through automation.
  • Adding big data capabilities to a SQL data warehouse, instead of adding SQL to big data repositories.

 

By offering capabilities based on these ideas, Snowflake seeks to both overcome the challenges of previous generations of data warehouse technology and embrace big data. (DISCLOSURE: I have worked on research and content marketing projects with Snowflake, Teradata, and most of the other players in the data warehouse and big data space.)

 

What Went Wrong With Data Warehouses?

For a variety of reasons, the journey toward a data warehouse has not been victorious for most companies. That is not to say that data warehouses have been a failure. We must remember that the need for the data warehouse grew out of the proliferation of enterprise apps. The data warehouse emerged to collect information, create one version of the truth, and support reporting and analytics. This it has done.

But the ways that scalability was supported with pre-computed structures such as star schemas, the difficulty of changing the structure of the database, and the need for experts to configure and administer the data warehouse eventually led to frustration. The development of MPP technology, BI suites, and the later generation of self-service technology were aimed at solving some of these problems.

The arrival of big data really put the data warehouse to the test, less because of volume (the best MPP data warehouses are quite scalable) than because of the variety of big data and a vast array of new types of analytics, some of which were not easy to do on the data warehouse.

Snowflake essentially argues that the data warehouse should be as flexible and easy to use as the best of the self-service technologies that allow analysts to get their hands on both massive SQL data sets and big data directly. Technology like 1010data provides a specialized language to enable this. Snowflake says, let’s do it with SQL, and add support for variably structured big data as well.

Of course, the traditional data warehouse vendors such as Teradata, and specialized tools like Wherescape and Attunity Compose, are aimed at solving some of these problems, as are Google’s BigQuery and Amazon Redshift. But Snowflake delivers something different from these choices, as I will explain.

 

In the end, the unrealized potential of the data warehouse has led to high levels of frustration for everyone, from the CEO who can’t understand why the business can’t use data more effectively to analysts who can’t get the data they need in a reasonable amount of time.

 

Can’t Hadoop Do It All?

 

Hadoop seemed to offer a chance at solving many of the problems mentioned. When the promise of big data emerged along with a new set of technology, it seemed reasonable to think of Hadoop as the one repository to rule them all. Perhaps by using big data technology, many companies have thought, we could finally achieve the data warehouse victory we had always been after. After all, Hadoop was built to scale out, to handle all forms of data, and to allow any type of analytics. Also, even though Hadoop was not built specifically for the cloud, because it rose to prominence as the cloud matured, there are many cloud-based hosting options for Hadoop.

 

But it eventually became obvious that the “expand” part of the Hadoop vendors’ land-and-expand strategy wasn’t working out. Getting data into a data lake is one thing. Managing that data and extracting value is quite another. This second step hasn’t been made easier by Hadoop, but by the emergence of Spark, which provides mechanisms designed for application development and various types of analytics.

 

But it also turns out that big data doesn’t stay big for long. It quickly gets distilled into tables, and SQL is a great way to perform data analysis, especially in combination with the huge body of analytics and tooling that supports SQL. This has led to a quest to put SQL on top of Hadoop, but most companies have found trying to use SQL on top of Hadoop even more complicated and exasperating than conventional data warehouses. SQL on Hadoop is a work in progress that started with simple queries against large data sets, which work reasonably well. But when you get to the complex SQL queries and multiple workloads that most data warehouses can handle, Hadoop-based SQL is not yet mature enough to handle them.

 

Because Hadoop did not turn out to be the one repository to rule them all, most companies have actually been left fighting a battle on two fronts: the fight to conquer big data and the battle to make the data warehouse work. The problem with this is that most big data technology seems to ignore the victories of the data warehouse and instead attempts to start everything over from the beginning. This is counterproductive.

Bob Muglia, Snowflake’s CEO, and the company’s founders noticed these trends and realized that a new synthesis was possible. Why not take some of the great aspects of big data technology, such as the ability to scale out and handle semi-structured and machine data, and add them to a SQL data warehouse built from scratch for the cloud, one that embraces the zero administration, ease of deployment, and auto-scaling the cloud makes possible? All of the SQL skills your company now has remain relevant to the world of big data. And given that a huge amount of enterprise data is now generated in the cloud, it makes sense for the engine that processes it to be there as well.

 

Based on this vision, Snowflake developed a strategy to win both the data warehouse and big data battles by building on the achievements of the data warehouse, the flexibility of systems such as Hadoop, and the true elasticity of the cloud. Essentially, through their work, the data warehouse has been reborn, not as what it has or hasn’t been, but as what Muglia believes enterprises have always wanted it to be, including a way to embrace big data.

 

What makes Snowflake so interesting to me is one of the company’s strategies. It’s easier to make a SQL data warehouse speak big data than to make Hadoop speak SQL. The engineering distance from SQL to big data turns out to be shorter than the distance from Hadoop to mature SQL. Snowflake’s argument is that by creating a data warehouse in the cloud that can truly scale to big data levels, and connect to semi-structured data, the data warehouse can become what it was always meant to be – what Hadoop tried to be.

 

I think this argument is made stronger because big data is quickly distilled as it is used and becomes columnar data of a manageable size that is best operationalized using SQL. Snowflake doesn’t solve all of the problems that prevent data nirvana, but it takes many off the table.

 

The Current State of Play

 

Here’s Snowflake’s approach in more detail. The company was founded on three key ideas. The first was to recognize the victories that emerged from the struggle to make data warehouses easier to use and more powerful. Second was to build on the accomplishments of cloud computing by reengineering the data warehouse to take full advantage of the cloud. Third, Snowflake sought to incorporate big data by remaking the data warehouse with the tools needed to easily handle big data.

 

As Muglia recently told me, “Understanding data is still very much a work in progress for almost all customers. Data is a huge opportunity for almost everyone and it’s very unrealized.” The reason for this is that the challenge in making data warehouses enterprise-ready is not always technological in nature. It is often most difficult to create a common understanding of the business – as in, what does everyone in a corporation need out of data and what are the tools needed to meet these requirements?

 

One company that achieved this vision on their own is Warby Parker, which I’ve written about a few times (“Why You Can’t Be Data-driven Without A Data Catalog” and “The Heartbeat Of A Data-Driven Culture: How To Create Commonly Understood Data”). Using Looker, they built a solution in-house, creating a definition of the concepts used to run their business, embedding them in a good data warehouse, and making them explorable and findable. Now the entire company is served by a foundation of clean, well-understood data. Data nirvana come to life.

 

Yet Warby Parker is the exception, not the rule: most companies have struggled. They simply do not have the time or expertise to create this type of advanced solution. Their frustrations are understandable. Often, they also lack an organized approach to data. But there are two other big reasons companies have failed with existing data warehouses: 1) scale and concurrency limitations due to lack of resources (for the larger use cases) and 2) ease of use. Additionally, data warehouses have not been able to handle big data due to the structure of the data itself. Traditional data warehouses cannot easily incorporate machine-generated and semi-structured data.

 

For in truth, the modern data warehouse wasn’t built to do what most companies need it to do in order to fight the two-front battle of big data and the data warehouse. Running SQL on Hadoop hasn’t worked as well as it needs to, and that’s because of the engineering distance issue. Making SQL run on Hadoop in a trivial way is far different from making a truly scalable SQL-based data warehouse. SQL is a relational technology, and Hadoop was never built with a relational model in mind. And more broadly, Hadoop was not created with enterprise needs at the forefront. The story of Hadoop has been one of gradually adapting a scalable data processing technology to enterprise needs. That story is still unfolding, and most of the solution seems to be provided by Spark.

Existing data warehouse companies like Teradata, Vertica, Netezza, and Greenplum are facing the challenge of making their technology as easy to use and as simple as public cloud technology. They are attempting to adapt their engines to the cloud model to achieve the simplicity and scalability of cloud-native technology. Muglia thinks this engineering distance will be larger than these companies realize. Part of the reason for this is that traditionally, data warehousing solutions have the compute and data sides tightly coupled. They’re embodiments of the old adage of bringing the data to the compute. But this is actually causing the problem, because resources end up overtasked at any given time. To be fair, this limitation is high on the minds of all the traditional data warehouse vendors, and they are working feverishly to address it. For example, Teradata recently announced support for AWS using the same engine as it uses on-premises. My recent article on Teradata (“Teradata’s Quest To Become The Perfect Cloud Data Warehouse”) describes their journey. It is the first entry in a series I am doing evaluating the capabilities of cloud data warehouses based on the framework for comparison set forth in the story “What Should The Data Warehouse Become In The Cloud?” I will use this framework to evaluate Snowflake, as well as Google BigQuery and Amazon Web Services Redshift, other examples of the data warehouse based in or brought to the cloud.

 

One thing is for sure: All of the vendors are claiming they have separated compute from storage in one way or another. I have found it difficult to understand what each of them is doing and what impact it will have. I will address this point in later stories.

 

So what, exactly, is Snowflake doing that is different?

 

A Way Forward

 

For Snowflake, separating the compute from the data is key to overcoming the limitations of the traditional data warehouses. Making this division is what Snowflake’s model is based on. By doing so, they’ve created a new modern cloud infrastructure that is low on administration and high on automatic scaling, making it easier to bring big data to the data warehouse than most companies thought possible.

 

Muglia explained the thinking behind Snowflake’s approach. “We completely break apart data and compute so you can store as much data as you possibly want and do so very cost-effectively on the one hand, and throw compute resources against that in a way that is completely independent of the storage,” he told me.

 

Snowflake can thus have multiple sets of computing resources working on the same data at the same time. “We can allow for what is effectively infinite concurrency because we can throw multiple sets of computing resources against the data problem at the same time,” Muglia said. “And of course, underneath us we leverage the fact that there’s a cloud, which essentially gives us the ability to muster up those resources on demand for our customers.”
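
As an illustration of what that separation looks like in practice, here is a hedged sketch using Snowflake’s Python connector: two independently sized compute clusters (“virtual warehouses”) are created against the same stored data, so a heavy workload on one need not slow queries on the other. The credentials, warehouse names, and table are placeholders, not anything from the article.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Placeholder credentials; substitute your own account details.
conn = snowflake.connector.connect(user="USER", password="PASSWORD", account="ACCOUNT")
cur = conn.cursor()

# Two virtual warehouses: independent compute clusters of different sizes
# that both operate on the same underlying storage.
cur.execute("CREATE WAREHOUSE IF NOT EXISTS etl_wh WITH WAREHOUSE_SIZE = 'LARGE' AUTO_SUSPEND = 60")
cur.execute("CREATE WAREHOUSE IF NOT EXISTS bi_wh WITH WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60")

# Dashboard queries run on the small cluster; a bulk load could run on
# etl_wh at the same time without competing for the same compute.
cur.execute("USE WAREHOUSE bi_wh")
cur.execute("SELECT COUNT(*) FROM my_db.public.events")  # hypothetical table
print(cur.fetchone())
```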

 

This architecture of a modern data warehouse built for the cloud remains a crucial factor for companies. The reason for this is obvious: the more difficult you make it to access, analyze, or use data, the less likely people are to make the effort to integrate it into their work. Semi-structured data and machine data have made the administrative and formatting side of the traditional on-premises or cloud data warehouse cumbersome, and therefore made the whole data extraction and exploration process seem intimidatingly complex.

Snowflake’s solution for this has been to ensure that semi-structured data can be treated as if it has columnar structure that can then be used with the relational SQL model. Muglia told me that this approach greatly facilitates a better user experience. “All customers have to do is load data and run queries,” he said. “We can load JSON, Avro, and XML into our database because we columnarize data, so we discern information about that data as we load it.”
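
To make the “columnarize as you load” idea concrete, here is a minimal Python sketch (an illustration of the general technique, not Snowflake’s actual engine) of how a loader can shred nested JSON documents into per-path columns and record each column’s observed type as the data arrives:

```python
import json
from collections import defaultdict

def flatten(record, prefix=""):
    """Turn a nested JSON object into {"a.b.c": value} pairs."""
    flat = {}
    for key, value in record.items():
        path = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, prefix=path + "."))
        else:
            flat[path] = value
    return flat

def columnarize(json_lines):
    """Shred a stream of JSON documents into per-path columns and note
    the types observed in each column as the data is loaded."""
    records = [flatten(json.loads(line)) for line in json_lines]
    paths = sorted({p for r in records for p in r})
    columns = defaultdict(list)
    for record in records:
        for path in paths:
            columns[path].append(record.get(path))  # None where a document lacks the field
    types = {p: {type(v).__name__ for v in vals if v is not None}
             for p, vals in columns.items()}
    return dict(columns), types

events = [
    '{"user": {"id": 7, "name": "ada"}, "action": "login"}',
    '{"user": {"id": 9}, "action": "purchase", "amount": 42.5}',
]
cols, types = columnarize(events)
print(cols["user.id"])  # [7, 9]
print(types["amount"])  # {'float'}
```

A query that touches only user.id can then read just that column, which is the property a relational SQL engine needs from semi-structured input.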

 

Snowflake’s cloud-built data warehouse is software as a service, engineered for rapid scaling. Snowflake was designed by engineers for the enterprise. Muglia said his core belief about business today is that “data is the fuel of modern business and customers need to get an answer to their business problems.”

 

Snowflake is well worth watching going forward. The company is yet another indication that the data warehouse is not outdated – in fact, it’s still essential for enterprises. It just has to be reborn with innovations specifically targeted to help enterprises win the battle with big data.

 

Article courtesy of Dan Woods.

This Robot Will Carry Your Stuff and Follow You Around


By Will Knight


Vespa maker Piaggio’s new robot servant is yet another sign of the transportation industry reinventing itself.

Inside an industrial building in Somerville, Massachusetts, I’m watching a robot follow someone around like an eager puppy.

The light-blue robot, called Gita, is almost spherical, with two wheels about the size of those you’d find on a mountain bike. A nearby laptop shows the world as perceived by the robot: a “point-cloud” of dots representing the 3-D shape of the room and the hallway outside, generated using a series of cameras attached to the bot’s body.

Gita was developed by Piaggio, an Italian automotive company that makes various lightweight vehicles but is most famous for making the iconic Vespa scooter. The robot is an experimental new way of transporting stuff. The top of the robot opens up, allowing it to store up to 40 pounds of whatever you might otherwise lug around yourself. The company is about to begin testing Gita in a number of industrial settings, including factories and theme parks. But the hope is that the robot may also appeal to consumers who might want a robot assistant as they walk, run, or ride a bike (it has a top speed of 22 mph).

Gita is a clear sign of the technological revolution currently shaking the world of transportation. As new technologies start to upend modes of mobility that have changed barely at all in decades, the automotive world is rapidly reinventing itself (see “Rebooting the Automobile”).

Jeffrey Schnapp, CEO of Piaggio Fast Forward, a subsidiary, says the company is trying to do something distinct in the transportation space. “A lot of focus is on automobiles and drones,” he says. “There are places where human-robot interaction makes sense.”

Piaggio created Piaggio Fast Forward 18 months ago. Its mission is to experiment with new modes of transportation and new technologies. The sensors, control systems, and electric propulsion used in the new robot could all prove crucial for future Piaggio products, says Michele Colaninno, chairman of the board of Piaggio Fast Forward. The new robot is also a natural extension of the three-wheeled scooters Piaggio makes for commercial use.

Still, as with many of the ideas being tested by transportation companies, including self-driving taxis, semi-automated trucks, and delivery drones, the underlying technology and the potential applications remain a bit unproven. Gita might be useful in some settings, perhaps for those who have to drag things around for delivery, but it’s less clear that such a robot would appeal to ordinary consumers. Although Gita features an obstacle detection system and can stop very quickly, it isn’t hard to imagine it running into people in a bike lane.

Gita balances itself as it travels, keeping its cargo level. The robot runs for eight hours, and can recharge in a regular outlet. It has three different modes: following someone, driving autonomously, and platooning with other Gita vehicles. The company has not announced a price.

Perhaps the most interesting thing about Gita is its sensor technology. Rather than using more expensive sensors such as lidar, which bounces a laser off objects to build a 3-D picture, the robot maps its environment using video cameras. This involves comparing images captured at different points in time, and using the differences to infer the three-dimensional structure of a scene. Gita uses a stereoscopic camera for this and several other fisheye cameras to provide a 360-degree view around the robot. Schnapp says the video mapping system can be less reliable in poor lighting or bad weather, and they are considering adding a light to the robot to address this.
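
Piaggio hasn’t published the details of Gita’s vision pipeline, so the following is only a textbook sketch of the kind of camera-based depth mapping described here: compute per-pixel disparity between a rectified stereo pair with OpenCV, convert it to metric depth, and back-project a pixel into a 3-D point for a point cloud. The image files, focal length, and baseline are assumed values.

```python
import cv2
import numpy as np

# Assumed inputs: a rectified left/right grayscale pair from a stereo camera.
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block-matching stereo: per-pixel disparity (returned as fixed point, so divide by 16).
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0

focal_px = 700.0    # assumed focal length in pixels
baseline_m = 0.12   # assumed distance between the two cameras, in meters

# depth = focal_length * baseline / disparity, only where disparity is valid
valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = focal_px * baseline_m / disparity[valid]

# Back-project one pixel (u, v) into a 3-D point for the point cloud.
v, u = 240, 320
if valid[v, u]:
    z = depth_m[v, u]
    x = (u - left.shape[1] / 2) * z / focal_px
    y = (v - left.shape[0] / 2) * z / focal_px
    print("3-D point (meters):", x, y, z)
```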

The robot follows people not by tracking them, but by comparing its view of the world to one captured from a set of cameras on a belt that’s worn by the person it’s following. This allows the robot to follow a person’s route long after he or she has traveled it. Although, honestly, the belt looks a bit dorky.

Piaggio hasn’t said when the robot will go on sale, and indeed it is a little rough around the edges. But it may not be long before the first robot helpers are spotted chasing their owners along sidewalks and bike paths.

Engineers eat away at Ms. Pac-Man score with artificial player


Using a novel approach for computing real-time game strategy, engineers have developed an artificial Ms. Pac-Man player that chomps the existing high score for computerized play.

In the popular arcade game, Ms. Pac-Man must evade ghost enemies while she collects items and navigates an obstacle-populated maze. The game is somewhat of a favorite among engineers and computer scientists who compete to see who can program the best artificial player.

The record score at the annual Ms. Pac-Man Screen Capture Competition stands at 36,280, but a trio of researchers led by Silvia Ferrari, professor of mechanical and aerospace engineering at Cornell, has produced a laboratory score of 43,720.

The score was achieved using a decision-tree approach in which the optimal moves for the artificial player are derived from the maze geometry and dynamic equations that predict the movements of the ghosts with 94.6-percent accuracy. As the game progresses, the decision tree is updated in real time. The strategy is detailed in the study “A Model-Based Approach to Optimizing Ms. Pac-Man Game Strategies in Real Time,” to be published by the journal IEEE Transactions on Computational Intelligence and AI in Games.

“The novelty of our method is in how the decision tree is generated, combining both geometric elements of the maze with information-gathering objectives,” said Ferrari, who noted that the information in this case is the fruit Ms. Pac-Man collects for bonus points. Her team is the first to mathematically model the game’s components, whereas previous artificial players were developed with model-free methods.
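
The paper’s exact formulation isn’t reproduced here, but a stripped-down Python sketch shows the general shape of such a model-based planner: expand a small, depth-limited decision tree of Ms. Pac-Man’s moves, score each branch by the points it collects minus the risk implied by the predicted ghost positions, and replan every step. The maze, reward values, and ghost model below are toy assumptions, not the study’s equations.

```python
import math

MOVES = {"up": (0, -1), "down": (0, 1), "left": (-1, 0), "right": (1, 0)}

def predicted_ghosts(ghosts, step):
    """Toy ghost model: assume each ghost keeps its current heading.
    (The real system models ghost dynamics with ~94.6% accuracy.)"""
    return [(x + dx * step, y + dy * step) for (x, y), (dx, dy) in ghosts]

def score(pos, pellets, ghosts, step):
    """Reward for eating a pellet, penalty for being near predicted ghosts."""
    reward = 10.0 if pos in pellets else 0.0
    risk = sum(math.exp(-abs(pos[0] - gx) - abs(pos[1] - gy))
               for gx, gy in predicted_ghosts(ghosts, step))
    return reward - 50.0 * risk

def best_move(pos, pellets, ghosts, depth=3, step=1):
    """Depth-limited decision tree over Ms. Pac-Man's next moves."""
    best, best_value = None, -math.inf
    for move, (dx, dy) in MOVES.items():
        nxt = (pos[0] + dx, pos[1] + dy)
        value = score(nxt, pellets, ghosts, step)
        if depth > 1:
            value += best_move(nxt, pellets - {nxt}, ghosts, depth - 1, step + 1)[1]
        if value > best_value:
            best, best_value = move, value
    return best, best_value

pellets = {(1, 0), (2, 0), (3, 0)}
ghosts = [((5, 0), (-1, 0))]               # one ghost at (5, 0) heading left
print(best_move((0, 0), pellets, ghosts))  # prints the chosen move and its score
```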

Engineers take an interest in artificial players because they provide a benchmark challenge for developing new computational methods that can be applied to practical needs such as surveillance, search-and-rescue and mobile robotics. “Engineering problems are so complicated, they’re very difficult to translate across applications. But games are very understandable and can be used to compare different algorithms unambiguously because every algorithm can be applied to the same game,” Ferrari said.

What began as such an exercise became a spectacle in 1996 when Deep Blue, a chess-playing computer developed by IBM, defeated world champion Garry Kasparov in a game during their first match. However, it took Deep Blue 11 more games, and a rematch the following year, to defeat Kasparov in a full match.

Ferrari’s Ms. Pac-Man player faces its own challenges against human players. The study found that the artificial player was not able to average better scores, or produce higher scores, than humans who routinely played the game.

“It’s very interesting which problems are easier for humans and which are easier for computers,” said Ferrari. “It’s not completely understood right now what elements of a problem allow humans to outperform computers and it is a question we are investigating with neuroscientists through collaborative projects supported by the Office of Naval Research and the National Science Foundation.

“In the case of Ms. Pac-Man, our mathematical model is very accurate, but the player remains imperfect because of an element of uncertainty in the decisions made by the ghosts.”

However, Ferrari’s model did produce better scores than beginners and players with intermediate skills. The artificial player also demonstrated that it was more skilled than advanced players in the upper levels of the game where speed and spatial complexity become more challenging.

While the Ms. Pac-Man Screen Capture Competition is now on indefinite hiatus, Ferrari said she may still revisit the project and improve the artificial player by adding a component that would allow it to autonomously learn from its mistakes as it plays more games.

 

 

Materials provided by Cornell University. Original written by Syl Kacapyr. Graphic courtesy of Rob Kurcoba.

What Makes Rails a Framework Worth Learning in 2017?


Courtesy of David Heinemeier Hansson at Quora

The same reasons why it was a framework worth learning in 2004. The more things change, the more they stay the same. While we’ve seen a lot of progress in the JavaScript world, we’ve also seen a regression to the complexity-laden world that Rails offered refuge from in the early days.

Back then the complexity merchant of choice was J2EE, but the complaints are uncannily similar to those leveled against JavaScript today. That people spent hours, if not days, just setting up the skeletons. The basic build configurations. Assembling and cherry-picking from lots of little libraries and frameworks to put together their own snowflake house variety.


The core premise of Rails remains in many ways as controversial today as it was when it premiered. That by formalizing conventions, eliminating valueless choices, and offering a full-stack framework that provides great defaults for anyone who wants to create a complete application, we can make dramatic strides of productivity.

It’s somewhat surprising to me that despite the astounding success of Rails, there hasn’t been more competition for this proposition. The vast majority of activity today is for yet another option on the a la carte menu. Yet another build system, yet another view library, yet another ORM. Very little activity in integrated solutions.

I guess the answer is that the foundational proposition of Rails continues to cut against the psychological grain of most programmers. That by reducing choices and accepting community conventions and answers to most of the basic questions in web development, you end up better off. Less unique, less tailored, but in ways that just don’t matter anyway.

Anyway, that’s the big ideological appeal of Rails. I’ve elaborated further on convention over configuration, the a la carte/omakase conflict, the appeal of integrated systems, and other core values of the Rails community in The Rails Doctrine.

After reading that, you’ll probably have a pretty good idea as to whether Rails is something for you or not. If you can’t recognize any of the struggles outlined in that document, or you just don’t like the solutions presented to those struggles, the particulars of Rails technology probably don’t matter much. If that document resonates, or at least piques your interest, read on.

On top of these ideological choices, we’ve built an incredibly pragmatic and multi-paradigm web framework. When people hear “web framework”, they sometimes think, “oh, that’s just some stuff to generate HTML, right?”. And in that definition, some might see it as though Rails competes against something like React. And I suppose it does, but in a very remote way that isn’t very useful to thinking about whether Rails is right for you or not.

As I talked about above, Rails has an incredibly ambitious mission. In the full-stack goal lies a pursuit to deal with just about every piece of code needed to connect databases and NoSQL stores to a business domain model written in Ruby, to a set of controllers that expose that model via REST, and then, yes, finally to HTML. But that last step is a small minority of the code and focus of Rails.

So if you think that client-side MVC, React, Angular, or whatever is The Future, then you’re still squarely in the target audience for using Rails. Because the bits you use to design your HTML/JavaScript-based UI still need to connect to a back-end domain model that saves stuff to the databases, computes things, enqueues jobs for later processing, sends out emails, triggers push notifications, and all the other stuff that real apps need to do.

And that’s where the meat of Rails sits. In what happens once that POST or PUT or GET is triggered. Now, as I said, Rails is full-stack by default. So of course we also include answers for how to generate and update HTML. We have some phenomenally productive answers in Turbolinks and SJR, but even if that path doesn’t appeal, everything that leads up to generating that JSON is still stuff we’ll have in common.

Anyway. That’s a very long pitch for two basic tenets of Rails appeal in 2017: 1) We have a unique ideological foundation that’s still controversial today and offers the same benefits against the mainstream choices as it did thirteen years ago, 2) We have a pragmatic, full-stack answer that could be formulated based on that ideology that still offers amazing productivity from the second you run the rails new command.

Oh, and on top of all that, I’ve saved the cherry for last. You get to use Ruby, which, even in a world that has rediscovered the benefits of functional programming and immutability, remains the most extraordinarily beautiful and luxurious language I’ve yet to encounter. Just look at some code. I dare you not to fall in love.

 

Teach Your Kid to Code, Here’s How…


Give your children as many opportunities as possible to try programming.

Many parents think that programming is too complicated for their kids. Nothing could be further from the truth.

I used to teach coding classes with my co-founder, Marco Morawec, to children as young as seven years old:

We ran a four-hour workshop in Boston that taught seven- to ten-year-olds how to build a computer game. The students learned how computer programs work and learned the fundamental building blocks of everyday computer programs (arguments, methods, logic, if/else, loops, etc.).
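
As an illustration of those building blocks (the workshops themselves used game-building tools and App Inventor, not this exact code), a tiny guessing game that uses a method, an argument, a loop, and if/else logic looks like this in Python:

```python
import random

def play(max_number):                   # "play" is a method; "max_number" is its argument
    secret = random.randint(1, max_number)
    guess = None
    while guess != secret:              # a loop: keep asking until the guess is right
        guess = int(input(f"Guess a number from 1 to {max_number}: "))
        if guess < secret:              # if/else logic: give a hint
            print("Too low!")
        elif guess > secret:
            print("Too high!")
    print("You got it!")

play(10)
```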

We also ran a more advanced class for eleven- to fifteen-year-olds that taught them how to build a functional phone application. We used something called App Inventor, a coding environment developed by Google and MIT, that makes it easy to build this type of stuff without fancy hardware requirements.

Coding is the literacy of the 21st century. On top of that, it’s incredibly rewarding! Kids love to create, have amazing imaginations, and if you give them a chance to build something cool with code, they’ll be so excited to show it off to you and their friends.

If your children are already interested in learning to program, that’s awesome. To help foster that interest, I’d recommend a few things:

Sign them up for coding workshops. You can find a lot of these on meetup.com.

Learn together with your kids. I started programming because my dad had a programming book. I grabbed it from his office and went through much more of it than he ever did. I thought he was going to be mad that I stole his book. But he was actually super happy that I was doing it. This made me want to keep learning.

Don’t worry too much about which language you choose. The path to becoming a self-sufficient developer, who understands how to continuously learn new programming concepts and languages, is more important than the details of any particular programming language itself. Just pick a language and stick with it. We have an online course on HTML/CSS and Ruby if you want to start there.

 

Article courtesy of Ken Mazalka on Quora.

Pong’s Inventor Wants to Bring Virtual Reality to Arcades


by Rachel Metz

In 1972, Atari founder Nolan Bushnell invented Pong, a version of table tennis that, in many ways, launched the video-game industry. Forty-five years later, Bushnell is using that same simple game to test the waters for virtual-reality arcade gaming.

 

Bushnell’s latest venture is a company called Modal VR, which is building its own wireless virtual-reality headsets and games that it plans to roll out in places like arcades, malls, and movie theaters in the coming months. On a recent Saturday in San Francisco, Bushnell—now a grizzled guy in his mid-70s sporting a Patagonia pullover and black Modal VR hat—sat quietly in the audience as a line of people of all ages shuffled through a classroom-size open space. In pairs, they donned a prototype bulky black headset and played Pong in virtual reality, running from side to side to control the game’s simple white paddles.

 

It was fitting, considering “we’re at the Pong stage of VR,” Bushnell said with a smile.

 

While several high-end headsets were released last year that can bring virtual-reality experiences to your living room, adoption of the technology is still in its earliest days for a bunch of reasons—the headsets are still bulky and expensive, and there isn’t all that much to do once you’ve got one on your face. More than two million headsets were shipped worldwide in 2016, according to an estimate from market researcher Canalys, but this figure pales in comparison to the popularity of, say, video game consoles (sales of the leading one, Sony’s PS4, topped six million during the 2016 holiday season alone).

 

Consumer virtual reality will likely catch on as prices come down and headsets improve. In the meantime, though, a number of companies are betting that consumers may be happy to pay a much smaller amount to try the technology with their friends at, say, an arcade, theme park, or bowling alley.

 

Modal VR says its technology lets up to 16 users explore virtual reality in large spaces—as big as 900,000 square feet, in theory, which would offer a lot more mobility than you typically have in VR these days—and it has built a headset and three games so far (it also plans to let developers make games for its platform). One is a fighting game called Mythic Combat and another, called Project Zenith, is a first-person shooter game set in outer space. Pong is not meant to be one of its offerings—it was originally put together as a joke, in homage to Bushnell’s past—but the company decided to use the simple two-player game anyway to demonstrate what it’s working on at the World’s Fair Nano technology fair in San Francisco in late January.

 

Jason Crawford, the company’s founder (he piqued the interest of Bushnell, who is listed as a cofounder), says its headset is meant to be rugged enough for 11-year-olds to beat the hell out of at arcades. It doesn’t need to be tethered to a computer, and can be used to play games with other people who are right there with you, he says, though he declined to share specifics on the technologies behind it.

 

Carly Kocurek, an assistant professor at the Illinois Institute of Technology who studies video gaming and its history, agrees that for many people it makes sense to try virtual reality out in a more arcade-like setting, especially since it could come with instructions on how to use the technology.

 

“I think people are looking for reasons to be out of their houses,” she says. “There are a lot of people who don’t drink or don’t want drinking to be the only thing they do.”

 

One problem she foresees, however, is that people often become a spectacle for others when they’re wearing VR headsets—something she noticed on a recent trip to Las Vegas where she walked by headset-clad folks in the midst of a virtual-reality experience.

 

VR Pong players Nirav Doctor and his 10-year-old son, Sur, didn’t seem to mind, though. They were among those who took a turn running blindly from side to side in an area about the size of a classroom, acting as human paddles hitting a virtual cube back and forth while a small crowd watched their gameplay on a display overhead.

 

The game was only about two minutes, and the graphics were not much more advanced than on the original Pong, but both came away impressed. Sur, who won, liked that he had to move around a lot to keep scoring; Nirav said it was fun to play with his son.

 

Would they be willing to pay for something like this, though?

 

“Yeah, I think so,” Nirav said. “Once in a while.”