Social Distortion – Myth Busting

From a marketing perspective, social media has become something akin to the World Wide Web in the 1990s: organizations know they need to participate, but they’re not quite sure what to do once they get there. The more communications-oriented businesses have hit the ground running, using social media as an extension of their other channels, augmenting their existing strategies and taking to the medium as naturally as they took to the web when it first burst onto the scene as the game-changing medium that it was.

The question now is: how do organizations whose core business is not communications-oriented integrate social media into their business strategy? A number of “Social Media Experts” have popped up claiming to have the answer, and unfortunately many have given their clients bad advice or, worse, simply lied to them about the effectiveness of their work. At the end of the day, just because you understand the dynamics of social media does not mean you understand your client’s business, and if you don’t understand your client’s business, ultimately, you won’t be successful.

Social Media Mistakes

Our objective for this “social distortion” series is to:

  • Dispel some of the common myths that are perpetuated about social media
  • Show you how organizations are getting it wrong, and
  • Demonstrate some tactics that will help you define and achieve your social media goals.

Common Myths about Social Media

We took a look at some research conducted by Forbes and Inc. to identify what we at Agentes Consulting consider to be the most common myths about social media.

1. Everybody is on social media
While social media has grown exponentially over the past few years, the reality is that it’s simply not on everyone’s radar as a business consideration. The key here is to understand YOUR customer base, then identify those within that group who are on social media before you kick off an effective engagement strategy.

2. Social media is solely a broadcast channel
There are many businesses that look at social media as a one-way medium, “a cheap and powerful bullhorn” that allows them to reach a large number of people. This is true, but the key to effective communication is engagement, not broadcasting. Find a way to genuinely engage your network instead of just selling to them.

3. You can’t measure your return on investment in social media
Oh, but you can! As you’ll see later in our “Social Distortion” series, there are many ways to objectively quantify your return on investment for social media expenditures. There are tools to track your users, understand how they engage with your digital ecosystem, and ultimately translate that activity into a monetary value, giving you a precise ROI.
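The basic arithmetic behind that ROI calculation is straightforward. Here is a minimal sketch; the dollar figures are entirely hypothetical, and in practice the revenue number would come from your own analytics and conversion tracking.

```python
# Illustrative ROI calculation for a social media campaign.
# All figures are hypothetical placeholders.

def social_media_roi(revenue_attributed, total_cost):
    """Return ROI as a percentage: (gain - cost) / cost * 100."""
    if total_cost <= 0:
        raise ValueError("total_cost must be positive")
    return (revenue_attributed - total_cost) / total_cost * 100

# Hypothetical quarter: $12,000 in tracked conversions against
# $4,000 in ad spend and staff time.
roi = social_media_roi(revenue_attributed=12_000, total_cost=4_000)
print(f"ROI: {roi:.0f}%")  # ROI: 200%
```

The hard part, of course, is not the formula but attributing revenue to social activity in the first place, which is what the tracking tools are for.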

4. If it doesn’t go viral, it wasn’t worth the effort.
While viral campaigns can have a great impact online, the real long-term value to a business lies in consistently adding value for its audience. The more potential customers know about your level of expertise in your field, the more likely they are to buy.

5. The more often you post to Facebook, the better your campaign will perform.
For most companies, posting more than once a day is probably too much. “Facebook’s EdgeRank algorithm tends to favor posts from companies that have higher engagement rates,” says [TBD]. “Posting too often may mean that fewer people will see your future posts.” Statistically speaking, posts with images, videos and links perform much better than those comprised of just text. The takeaway here is to consider the quality of your posts, not just the quantity.

5 Ways Small Businesses Get Social Media Wrong, from Mashable

This recent piece is so relevant and applicable that we’ve extracted key elements for review and discussion. You can view the full article here.

1. Social Media Isn’t the Place for the Hard Sell
Social media is all about building relationships and growing trust. This means answering questions, providing helpful information, responding rapidly and serving as a trusted resource. Only five to ten percent of your social media activity (i.e. status updates or tweets) should be self-promotional.

2. Social Isn’t About Self-Promotion
Small businesses need to treat social media like a cocktail party among friends. To be liked, you have to be gracious, genuinely interested in others, and not dominate the conversation.

3. You Don’t Have to Be Everywhere
Navigating social media successfully doesn’t mean you need to be anywhere and everywhere. Instead, it’s about choosing one or two of the most relevant and effective channels for reaching your customers and focusing on those. Remember that a tepid or neglected social media presence will reflect poorly on your business.

4. You Don’t Have to Keep Up With the Big Brands
If you’re running a small business, you know there’s a big difference between your budget and that of Virgin America or Starbucks. Creating giveaways and contests is one of the most effective ways to generate new likes and improve overall engagement, but scale them to your means: don’t give away a stack of iPads you can’t afford. Instead, consider offering discounts, coupons or samples of your company’s services.

5. Social Media isn’t “Free”
Social media is far from free once you factor in the manpower it demands. These channels require constant commitment, from keeping fresh content on your accounts to engaging your community. If one employee spends approximately ten hours per week managing social media accounts, you can assign a hard cost to the effort. Small business owners need to understand the numbers behind every campaign, and that means factoring in everyone’s time and energy.
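The ten-hours-a-week example above converts to a hard cost with simple arithmetic. This is a back-of-the-envelope sketch; the hourly rate is a hypothetical placeholder, and you should substitute the fully loaded rate for your own staff.

```python
# Back-of-the-envelope labor cost of "free" social media.

HOURS_PER_WEEK = 10   # from the example above
WEEKS_PER_YEAR = 52
HOURLY_RATE = 30      # hypothetical fully loaded cost per hour

annual_hours = HOURS_PER_WEEK * WEEKS_PER_YEAR   # 520 hours
annual_cost = annual_hours * HOURLY_RATE         # $15,600
print(f"Annual social media labor cost: ${annual_cost:,}")
```

Even at a modest rate, the number is large enough to deserve a line in the marketing budget.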

Next in our “Social Distortion” series, we’re going to delve into how organizations flounder by focusing on the wrong demographics and choosing the wrong social media channels.

Cybersecurity, Human Factors & User Experience – Part 3

In 1965, Gordon Moore predicted that computing power would double every two years. Moore, who co-founded Intel, was surprisingly accurate in his rough estimation. What does this imply? Fifty years of doubling every two years is 25 doublings, so your current computer is likely to be roughly 2^25, or about 33 million, times faster than a computer from 1965, even if you have a slow and outdated machine.
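The compounding works out as follows; this is just Moore’s stated doubling rate applied mechanically over 50 years, not a claim about any particular machine.

```python
# Compounding under Moore's law: one doubling every two years.

YEARS = 50
DOUBLING_PERIOD = 2  # years per doubling

doublings = YEARS // DOUBLING_PERIOD   # 25 doublings
speedup = 2 ** doublings               # 33,554,432
print(f"{doublings} doublings -> roughly {speedup:,}x faster")
```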

Yet it’s also a deceptively simple picture of the evolution of technology. As Niklaus Wirth observed in 1995, “Software is getting slower more rapidly than hardware becomes faster.” Google co-founder Larry Page restated this in 2009, a credible endorsement in the tech world, and the observation was eventually dubbed ‘Page’s Law.’ This trend has proven as valid as Moore’s Law. Software obviously isn’t getting slower, per se, but in relation to its hardware counterpart, it lags.

This stems partially from the fact that as software becomes bigger, the number of bugs increases exponentially (not linearly). Bugs beget more bugs, and their relation to each other becomes more cryptic as software grows in size. Debugging is a huge part of developing software, often the most expensive line item in the cost of a project overall.

The bugs and loose ends that haunt R&D beneath the surface are usually a hacker’s secret back door. It’s the things we don’t know we don’t know that threaten us the most. 

As computer programming continues to evolve and becomes increasingly complicated for its creators, new vulnerabilities for end users keep cropping up as well. We’re standing on the shoulders of giants every time we boot up a computer, tablet or smartphone. Our user experience is more complicated and layered than ever before, with social media, banking, networking, photo and video sites, and email all interconnected. Yet we only see about 10% of what comprises a user interface. While the gears under the hood purr obediently, we take security and functionality for granted.

Think of the cyber landscape as an iceberg, with 90% of its mass beneath the surface, ambiguous to the eye and difficult to measure. Unfortunately, many of us make decisions based on the 10% of the data we have readily available. This can be deceptive – symptoms are not necessarily root problems, and causation can be hard to decipher.

While software engineers and architects grapple with a behemoth set of bugs, holes and vulnerabilities to keep things secure on the back end, smart users should take on some of this responsibility as well. Vigilance and caution are the keys to avoiding hacker attacks. A system is only as strong as its weakest link. If we bite a worm with a hook in it, we compromise not only ourselves but also our co-workers, friends and employers. We might even inadvertently expose sensitive or classified data that could threaten anything from a small business to our national security.

As consumers, we must stay abreast of new scams, bogus apps, and other potential threats that might be introduced into a system due to our clumsiness. All encryption, password protection, even voice and facial recognition safety measures fly out the window once a human user overrides these measures with an approving mouse click. A recent study showed 93% of computers have antivirus software installed, yet only 23% of smartphone users said they intend to install such software. On top of that, only half of all mobile users even lock their phones.

These are just basic front line measures. With smartphone threats on the rise, we’re sure to see new, prolific viruses making their way through the mobile OS world. The idea that everything is safe as long as you have antivirus software is an outdated concept. A zero-day virus (immune to all known antivirus software) can find its way onto a machine with the help of a human mistake, through a technical weakness, or by a combination thereof.

One of the most common ways humans betray their own security is by falling for ‘spear phishing’ schemes, many of which rely on a genuine-looking but utterly counterfeit UI. The ruse can take many forms, like a Facebook icon saying a new friend has invited you to do something, or that your password needs to be or has already been changed. Or, you may receive a fake eBay or Amazon confirmation email with an authentic-looking logo warning you that your credit card has just been charged. Fake emails that appear to come from a coworker or boss are common as well.

We take for granted the everyday user experiences we have with websites and applications we trust. Phishing schemes play on that very familiarity and sense of comfort. Familiar icons and logos automatically register in our brains as safe.
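One concrete habit that defeats many of these schemes is checking where a link actually points before trusting the familiar logo around it. The sketch below illustrates the idea with Python’s standard `urllib.parse`; the trusted-host list is a hypothetical example, not a real allowlist.

```python
# Illustrative check: does a link's real hostname match a site we trust?
# A phishing email can display "amazon.com" while linking elsewhere.

from urllib.parse import urlparse

TRUSTED_HOSTS = {"www.amazon.com", "www.ebay.com", "www.facebook.com"}

def looks_trustworthy(url: str) -> bool:
    """Return True only if the link's actual hostname is on our list."""
    host = urlparse(url).hostname or ""
    return host.lower() in TRUSTED_HOSTS

# The display text may say Amazon, but the underlying link does not.
print(looks_trustworthy("https://www.amazon.com/orders"))         # True
print(looks_trustworthy("http://amaz0n-billing.example/orders"))  # False
```

Real phishing defenses are more involved (lookalike domains, subdomain tricks, shortened URLs), but the principle is the same: judge the destination, not the decoration.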

Cyber security could also be described as ‘cyber vulnerability.’ Since the Web went mainstream in the mid-1990s, we have gone from a simple boxy PC plugged into a phone jack to an era of smartphones, 4G tablets, laptops, and cloud computing. As consumers and end users, we cope with many more dangers in the digital wild than ever before. There is far more beneath the surface than meets the eye.

Our experience as users should be one of utility and convenience; this lies at the heart of UX and UI design. We just need to remember to use caution and skepticism as well when we navigate the potentially perilous open seas.

Cybersecurity, Human Factors & User Experience – Part 2

Cybersecurity, Human Factors & User Experience is a multi-part series examining the impact of User Experience Design on cybersecurity, written by Stephen Ruiz.

Part 2 – Information Disasters (Tufte, Space Shuttles & Design Lessons that Save Lives)

“Good displays of data help to reveal knowledge relevant to understanding mechanism, process and dynamics, cause and effect.”

– Edward Tufte

When the Space Shuttle Challenger exploded on January 28, 1986, most observers believed that seven people lost their lives due to poorly functioning O-rings on a booster rocket. If you’re well versed in the theory and practice of Visual Design, you probably know the work of Edward R. Tufte, a professor emeritus at Yale University and an expert in the field. Dr. Tufte has long argued that misleading visual explanations (also known as bad design) were really at the heart of what went wrong.

As evidence, Tufte cites charts that Thiokol, the maker of the solid-rocket booster, gave to NASA; the charts in essence failed to communicate vital information about temperature and damage to the O-rings on previous flights. Tufte then designed his own chart, which demonstrated how the same data could have been presented in a way that showed the relationship between temperature and O-ring damage more clearly.

(Chart designed by Thiokol showing the relationship between temperature and O-ring failure. Temperatures are vertical, O-ring problems are squiggles. Two pictures per launch analysis.)

(Chart designed by Tufte. His solution uses two dimensions, horizontal and vertical. As the temperature dips lower on the left, the damage to O-rings goes up.)

This topic has generated a number of opinions and inspired exhaustive amounts of research. Our goal isn’t to toss yet another theory into the fray. The real takeaway here is that an interface-level design flaw resulted in disaster. This is a User Interface problem. It’s a tragic (and very public) example of the impact of information design upon our best and brightest. Remember, the charts were presented to the top experts at NASA, who quite literally were rocket scientists.
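Tufte’s redesign boils down to one move: order the evidence by the suspected cause (temperature) so the trend becomes visible. The sketch below illustrates the principle; the numbers are an illustrative excerpt in the spirit of the flight record, not the actual dataset.

```python
# Ordering evidence by the suspected cause, per Tufte's redesign.
# Each record: (launch temperature in degrees F, O-ring damage index).
# Values are illustrative, not the historical flight data.

launches = [
    (70, 0), (57, 4), (75, 2), (53, 11), (81, 0), (63, 2),
]

# Sort coldest-first, as Tufte's chart reads left to right.
by_temperature = sorted(launches)

for temp, damage in by_temperature:
    bar = "#" * damage or "-"   # crude damage "bar chart"
    print(f"{temp}F  {bar}")
```

Presented flight-by-flight in chronological order, the same numbers hide the pattern; sorted by temperature, the cold-weather risk is hard to miss.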

If rocket scientists couldn’t make a life-or-death decision based on a low-tech chart in 1986, how can we expect better results from average citizens in the immersive, high-tech environment of 2013?  

The truth is there should be no such expectation. The rapid evolution of technology has outpaced our ability to contain all possible negative outcomes. Since the Challenger disaster in 1986, digital cellular phones were invented, the World Wide Web was created, the CD replaced the LP, the MP3 replaced the CD and the world is now hyper-connected via social networks.  

In short, we’re all increasingly submerged in data. Private information about our finances, our health, and our personal relationships is regularly transmitted electronically. As greater aspects of our lives are dependent on our own personal digital ecosystems, Cybersecurity threats are bound to have a greater impact on everyday citizens. 

For maximum safety and security, today’s average smart phone users should also be Cybersecurity experts. But herein lies the problem. Most people simply don’t have the time or the training to safeguard their data properly. So what’s the solution? This expertise should instead be baked into the products, software and apps they use every day. Simple but effective security measures must be a basic aspect of the design.

For those of us in the business of designing these systems, this calls for a fundamental shift in how we do our jobs. The urgency has never been greater. We must start thinking not in terms of a “better interface,” but in terms of a new visual vocabulary. We must learn to design for people, not for machines. 

In August 2012, a San Francisco Design and Strategy firm called Cooper posted a piece by Golden Krishna titled “The best interface is no interface.” The premise is that a good user interface is one that’s so seamlessly designed into the product that it’s as though it’s not even there. 

This is a perfect segue into our next topic. Check back next week for an exploration of “The Iceberg Principle” as it applies to user interface design. 

Cybersecurity, Human Factors & User Experience

Cybersecurity, Human Factors & User Experience is a series examining the impact of User Experience Design on cybersecurity, written by Stephen Ruiz.

“Far from being an alternative to conventional war, cyber war may actually increase the likelihood of the more traditional combat with explosives, bullets, and missiles. If we could put this genie back in the bottle, we should—but we can’t. Therefore, we need to understand what cyber war is, to learn how and why it works, to analyze the risks, to prepare for it, and to think about how to control and deter it.”

– Richard A. Clarke, Counter-terrorism adviser to Presidents Bill Clinton and George W. Bush.

Besides the usual rhetoric of “the need for peace talks” and “a long-term solution to the problem,” the most recent flare-up in the decades-old conflict between Israel and Hamas this past November may have broken new ground in its use of cyber war tactics. While cyber war is not a new topic in the nation’s zeitgeist, the number of reported incidents has grown exponentially over the past few years. A video from CNN describes the cyber attacks related to the conflict.

With some security analysts predicting that 2013 is the year nation-sponsored cyber-warfare will go mainstream, it’s no wonder that leaders from several western nations have made cybersecurity a top priority within their governing agendas. In his 2013 State of the Union address, President Barack Obama outlined his executive order addressing cybersecurity: Improving Critical Infrastructure Cybersecurity. This is no longer just an issue for those in the industry. The topic has moved from the realm of tech-savvy people and digital professionals and onto our national stage. Cybersecurity has gone mainstream.

Now that cybersecurity is an integral part of Homeland Security’s focus along with the vulnerability of critical infrastructure (power, water, and nuclear systems), two things have become abundantly clear. One is that with the increase in attacks (along with the proliferation of connected devices like smart phones, tablets, computers and even household appliances), the responsibility of keeping data and systems secure will require more and more input from people who are not technical professionals. The second is that we need to place a greater emphasis on User Centered Design, not just for ease of use but for the sake of our safety.

The very real (and very scary) truth is that human error accounts for a staggering number of security breaches. Here are some frightening statistics from a chiefexecutive.net article from 2011:

  • In October 2010, Microsoft blamed human error after two computers on its network were hacked and then misused by spammers to promote more than 1,000 questionable online pharmaceutical websites.
  • In April 2011, the State of Texas discovered that the personal and confidential data of 3.5 million teachers, state workers, retirees and recipients of unemployment checks had been left unprotected on the Internet for nearly a year. According to Gartner, Inc., more than 99 percent of firewall breaches are caused by misconfigurations rather than firewall flaws.
  • The State Department’s 2008 passport-system breach was the result of under-configured access control and a defendant’s “idle curiosity,” piqued by the simple discovery that he “could.”

Hypothetically, one could have the most sophisticated technology imaginable, but if it isn’t intuitive or designed to account for human error, then it simply would not be effective. So, what can public and private-sector organizations do to address the problem of human error? The first step is to design for humans.

As we’ll see in the next installment, Part 2 – Information Disasters (Tufte, Space Shuttles & Design Lessons that Save Lives), smart, user-focused design can literally mean life or death.