VCIO – Better IT for Small-Medium Organizations

The Problem of CIOs in Small-Medium Organizations

Organizations are designed around responsibilities and the personnel capable of carrying them out.  These are then placed in the hierarchical structure we see reflected in an “org chart”.  One of the boxes on this chart an organization eventually sees a need for is a CIO or CTO: somebody responsible for the information technology function who has the capabilities to carry it out.

But small organizations rarely start off with a CIO; instead they start off with a junior help desk person or perhaps a network administrator.  The cost of starting with a highly skilled CIO is too high, and having them do help desk work or fix printers is not an effective use of budget.  But the low-end skills of a help desk person don’t give the organization the ability to align the business and technology. IT personnel early in their careers tend by default to focus more on technology and gadgets than on the business needs and the technology that would solve them.  They request a lot of budget, but bring few new ideas to production that will help the business. By having only the junior person on staff, the organization not only lacks the necessary skills, it also must now manage a skill set it has little understanding of.  This leads to ineffective decisions and misspent funds.

You might consider at this point, “Why not hire a CIO?”  For organizations of fewer than 500 users a CIO would be a luxury. They are a fairly expensive proposition, and with the revenues of a small organization there would not be enough funds to invest to take advantage of their skill set. They would have great ideas about implementing new sales force automation, e-commerce systems, inventory management, and other exciting prospects for the business, but would then face a budget that does not allow them to implement any of it.

What is the solution then? How can an organization have these essential skills and yet spend the right amount for their size?

Solution: A VCIO

A “Virtual CIO”, or as we will refer to it, a VCIO, is a fractional or part-time resource who can work with the business leaders of the organization and oversee IT operations.  They may work between 16 and 40 hours per month depending on the investment level and project status.  Sometimes they may be close to full time, but generally it will be much less than that.  They will be present for key meetings about business strategy, or perhaps the weekly operational meeting to review business status, and will work on their own to review options and speak with IT staff or end users about key issues.  You as an organization pay for only what you need.

Some capabilities and services a VCIO can offer you are:

  • IT Strategy: develop an overall plan for information technology based on the discussions with business leaders and understanding of the industry and market conditions.
  • Understand the business: the key issue here is that the VCIO will be a part of your management team, and will listen from a technology perspective to the problems the business has. Every person on the management team comes from a different focus: a sales manager will talk about how to increase sales, while a production manager will be focused on how production works with the rest of the organization. The VCIO listens from a technology perspective, so when they hear problems they contemplate and suggest options that employ technology in ways that others on the team have not considered.
  • Design the technology platform, policy and process, and personnel for the current and future needs of the business. These are the three key aspects of IT: the technology put in place such as which ERP system or servers to use, the process IT uses to support and manage systems and users, and the personnel and their roles within the IT department. The VCIO can align these to provide a fully effective IT function that meets the needs of the business.
  • Business Process Mapping – people with an IT skill set are often very keen on the business process and flow of data through the organization. A skilled VCIO can help you map the business process and identify bottlenecks and points where efficiencies can be gained either by implementing technology or changing a business process.
  • IT Roadmap aligned to the business: they can develop a roadmap of technology changes and suggested projects with business analysis included to show you how the future can look with technology options.
  • Benchmarking IT capabilities against industry standards: most markets are competitive, and technology is an enabler to increase the velocity of the business. To the extent a business uses IT well compared to others, it will have a greater velocity of revenue generation, product creation and production, and cash management. These key aspects of the business must exceed the competition or you will look up from the bottom of the market. A VCIO can help you benchmark and improve this.

Having a VCIO will provide your business with a single point of contact for IT needs that aligns to the business, a partner invested in your long-term needs, and a technology leader to sit with the CEO, CFO, and other business champions and make sure the technology is an enabler for the business.

Contact us today to discuss how we can help you for a manageable cost and maximum results.

Bimodal IT – Maintain the Now and Plan for the Future

 


Bimodal IT is a methodology organizations are adopting to divide focus and effort between taking care of current or legacy technology and looking to the future at what is needed to sustain the organization as business and technology cycles continue to ripple.

Every organization follows a methodology, whether it is formalized or not. This is true of the organization as a whole and of its key components or departments. Your accounting department follows a methodology according to Generally Accepted Accounting Principles (GAAP), your production and quality departments may follow methodologies directed by ISO standards for documentation, and Lean or Six Sigma as means to manage and measure production. The information technology department is no different, though it often lags behind other departments in terms of formality of methodology. If a methodology is identified, taught, communicated, and followed, it can assist the IT department in working with the business to align the technology to the needs of the organization.

One of the trends in Information technology management methodologies is Bimodal IT.

What is Bimodal IT?

Bimodal IT is an approach to information technology where two areas are in focus, with expected results established for each.

The first area is the traditional IT function which remains highly valuable: the normal “keep the current systems reliable, secure and performing” so the business can deliver on its plans and promises. The emphasis here is on safety, accuracy, reliability, and scalability.

The second area is innovation (or “fast mode”), which emphasizes speed and agility.

A great CIO will struggle to compete with small, disruptive startups that threaten the business. The startups do not carry the overhead an existing IT operation must maintain, and nothing distracts them from focusing on something new.  They can be fast and agile. But a good CIO can shift resources to focus on innovation.

Gartner research has studied this trend, where Peter Sondergaard, senior vice president and global head of research, said “CIOs can’t transform their old IT organization into a digital startup, but they can turn it into a bimodal IT organization. Forty-five percent of CIOs state they currently have a fast mode of operation, and we predict that 75% of IT organizations will be bimodal in some way by 2017.”

Bimodal IT is simply a shift in some resources, with goals of speed and agility to develop options and solutions for current and future problems.

What Problems does Bimodal IT Address?

There are several problems this methodology addresses.

  • Keeping IT current so the organization does not fall behind – a key business case for the danger of neglecting upgrades and innovation in IT is the New York subway system (discussed here). The subway system was designed and built in the 1930s to provide for safety and largely avoid collisions between trains.  The article’s author, Bob Lewis, points out the details; the estimate to replace it was set at 20 billion dollars.   The obsolete technology’s issues had been known for a long time, and discussions to plan its replacement in an orderly fashion within the budget and operational cycles had been thoroughly designed and vetted, right?  No, the discussions followed the same path they do in most organizations facing a potentially expensive replacement of a legacy system (the following italicized text is from Mr. Lewis’s article): Does any of this sound familiar — a legacy system that would be good enough except its architecture is obsolete, the platforms it runs on aren’t around anymore, and:
    • “Lift-and-shift” replacement provides no new features, and so no business-driven value to justify the expense?
    • Nobody can describe important new features that would justify anything more than a lift-and-shift replacement?
    • Investing in any replacement system would drain needed capital away from other efforts that are also important for the organization’s ongoing survival and success?
  • Increasing Value – the most persistent complaint from business leadership about IT is that it is unreliable. Once that is solved, the second most persistent complaint is that it is not adding value to the business. The IT department spends its budget and focus to “keep the lights on”, but never comes to the table with investment opportunities with clear ROI that will help the business. Many IT shops spend 85% of their budget on maintaining what is, rather than thinking about what could be.  This holds back 70% of IT leaders from focusing on innovative projects that would increase business value.  Bimodal IT allocates a certain percentage of the IT function to future needs, and should come with accountability to develop innovative options for the business.
  • Attracting great talent – Great talent in technology likes to work on interesting projects, so having projects that are more than just point upgrades will attract and retain people with better skills and ability to deliver innovation. They will add value in multiple ways in all areas of IT.

Is Bimodal IT a Fad, or Will It Help Me?

Maybe it is a fad term; the phrase Bimodal IT may go the way of “zero defects”, “Total Quality Management”, and other names for bygone methodologies. But the concept of planning the orderly replacement of obsolete technologies and developing new options is a good thing, no matter what it is called. This requires a focus on new solutions to the changing technology landscape and business challenges.

We focus on the actual goals, not the terms. And there are other methodology options that people are passionate about, such as DevOps or Agile, which are also good.  The main thought we raise is that we believe some of your efforts should be focused on the future, where technology can make a negative impact if not dealt with (the New York subway), or a positive impact on a growing organization (disruptive technologies that provide a competitive advantage, e.g. news blogs vs. traditional newspapers).

To maintain the now, and plan for the future, you will need a strategy to generate new options (innovations) which can be implemented (accountability).  This will help you avoid the negative and benefit from the positive.

Keystone’s Bimodal IT

We spend a lot of time looking at the current systems via monitoring tools, reports, and visual review.  We also look at the future: we just returned from the Consumer Electronics Show (CES) in Las Vegas, where we witnessed numerous trends in robotics, product development, monitoring with connected devices, and so much more.

We also published two articles on Technology Frontiers you may enjoy, part 1 and part 2.

One of the features we have at Keystone is a technology museum.  You may wonder what a museum has to do with Bimodal IT.  It does, for two primary reasons.

The museum has items that come and go to keep it fresh, but it starts in the 1800s with old journals of a store’s transactions and accounts (“the books”), which were filled out with a pen dipped in ink.  This was “technology”. It then moves to typewriters that replaced the pen, and PCs that replaced the typewriter.  These were shifts that had to be planned for, or the risk of being out of business was real. These past shifts give us insight into how to plan for future shifts.

We always reserve the last section of the museum for future technology; something that represents what comes next that can make a difference and must be planned for. Here we see things that are part of the Internet of Things (IoT), 3D printing for product development and someday delivery, voice command technology, and so much more.

It is all a continuum of technology we help clients understand and implement. The past into the future.  Maintain the now, and plan for the future.

If you would like to know how to implement this approach, contact Keystone today to start that discussion!

 

Technology Frontiers, Part 2: IoT and Voice Recognition

We previously wrote about some of the technology frontiers we are exploring, and described three that are exciting:

  • 3D Printing
  • Clustered Computing
  • Latest Applications, Operating Systems, and Devices

But much like explorers pushing into untouched territory, we have two more that are both exciting and showing real promise for the future of technology and how it affects our lives.  These are the Internet of Things (IoT) and Voice Recognition, especially when paired with artificial intelligence and machine learning.   We describe both of these in this article.

Internet of Things (IoT)

The Internet was originally an environment where we hooked our computers to an internet provider and started using email or the World Wide Web (WWW). Humans clicked links, watched videos, and sent emails. We initiated the majority of traffic by our explicit and direct actions, predominantly in a web browser.

But the use of the Internet as a super-highway for information has changed: now devices and things generate most of the traffic zipping through our data lines. In fact, Cisco did a study that estimated that “Data created by IoT devices will hit 507.5 ZB per year by 2019, up from 134.5 ZB in 2014.” (source: ZDNet article: http://www.zdnet.com/article/cloud-traffic-to-surge-courtesy-of-iot-says-cisco/). In case you are wondering, a “ZB” is a Zettabyte, or 1 billion terabytes – and that is a lot!

So what is the Internet of Things (hereafter abbreviated “IoT”)?  It is the accumulation of devices that are connected to the internet and generating (and sending) or receiving data.  It is sometimes called Machine to Machine communication (M2M; no humans involved).  Some examples:

  • Your cell phone’s GPS coordinates, reported while you are using maps.
  • A Nest thermostat in your home that you can connect to and raise the temperature, and which “learns” your life’s patterns to automatically start managing the system based on your history.
  • A location-based tracking beacon that shows you where your keys were left behind.
  • Public trash cans that use real-time data collection and alerts to let municipal services know when a bin needs to be emptied.
  • Wireless sensors embedded within concrete foundations to ensure the integrity of a structure; these sensors provide load and event monitoring both during and after construction.
  • Activity sensors placed on an elderly loved one that monitor daily routines and give peace of mind for their safety by alerting you to any serious disruptions detected in their normal schedule.
  • And so many more…

In every case it is some device that is communicating data, not a person directly doing so.

Based on the utility as well as the total data being collected, we can quickly see where this can explode.  Instead of you personally collecting and transmitting data, a device will do it for you. It is, in effect, what everybody dreams about when they imagine their refrigerator sending a replenishment list to the local grocery store (and by the way, Amazon now offers a “Dash Button” designed to order some common household items at the push of a button).
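
To make this concrete, here is a minimal sketch of a “thing” reporting data, written in Python. It assumes the paho-mqtt library (MQTT is a lightweight messaging protocol widely used for IoT) and a hypothetical broker at broker.example.com; a real thermostat or trash-can sensor would publish on its own schedule.

    import random
    import time

    import paho.mqtt.client as mqtt

    # Connect to a hypothetical MQTT broker (the address is an assumption
    # for illustration; you would point at your own broker or a cloud service).
    client = mqtt.Client()
    client.connect("broker.example.com", 1883)

    while True:
        reading = 68 + random.uniform(-2, 2)  # stand-in for a real temperature sensor
        client.publish("home/livingroom/temperature", f"{reading:.1f}")
        time.sleep(60)  # report once a minute, no human involved

Multiply that one small loop by millions of devices and the traffic figures quoted above start to make sense.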

Voice Recognition

We are using Voice Recognition more and more every day: in applications like Apple Siri or Google Now, when we call an automated attendant at an insurance company and say our date of birth or policy number to a computer, or when we use voice-to-text capabilities. You have likely used one of these recently, but never really thought about it. It has become commonplace, and is expanding to be an option of choice for interacting with data.

Like most people, I interact with a lot of email; usually between 100 and 200 legitimate emails per day that are critical.  Although I am sitting at a PC, I tend to grab my iPhone and use the microphone key to answer emails using my voice.  A quick press and I am orally stating my response, or sending a new email.  I also use Dragon products on both Windows and Macintosh OS X to generate larger documents.  In fact, this article is about 95% voice generated on a Windows laptop with Dragon NaturallySpeaking.  I use it to dictate the text, select text and apply formatting like bold or italics, and other advanced capabilities.  I confess that I do not type very well (if only I had joined the mostly female typing class in my high school!), so the ability to use my voice is a tremendous advantage. It is not only a convenience; it is a huge productivity boost; I have generated documents of thousands of words in an afternoon.

And while I love the ability to simply state my words and see them appear in an email or Word document, when voice recognition is combined with artificial intelligence such as Siri or Microsoft Cortana, it provides a truly personal digital assistant – one that knows what I am looking for. Here are some examples.

  • On my iPhone, I long press the home button and Siri pops up, and I say “When do the Cleveland Browns play?”, and Siri responds orally and on screen with the opponent and date/time of the next game.
  • On my Windows 10 PC I ask the same question and Cortana (the Microsoft voice persona) answers with the same basic info, but on screen she also shows the probability of victory for the Pittsburgh Steelers at a 70.2% chance today. And by the way, Cortana has been 140-84 through 16 NFL weeks.
  • On my Windows PC, I can ask “what documents did I work on today?” with my voice, and see a list of everything.
  • On my iPad, I can ask “What is my schedule tomorrow?” and see and hear a list of my appointments.
  • On almost any device, I can ask, “what is the temperature over the next 3 days?” and get a nice forecast for the next three days (it is getting colder…brrr…).
  • On my iPhone, I long press the home button, and say “Remind me to let the dogs in in 10 minutes” and a reminder is created that dutifully goes off 10 minutes later.
  • On my Android tablet I say “Ok Google”. Then “email to John Smith”, “subject Client X need”, “Message We need to call them back today” and it sends an email with that info to John on my team.

In other words, I can ask questions that are personal to me (what is my schedule?) or from my world (“what is the temperature over the next three days?”) and get a context specific reply. Or I can give instructions to do something I need (“remind me in 10 minutes to let the dogs in”).  It seems like I am asking a human who knows what I want, and they give me a reply that is appropriate for the context in which I asked.

These functions are easy to use, and I highly recommend that you try them out.  If you want a place to start, try one of the following:

  • On your Windows 10 PC, click the Cortana microphone and say “Help Me Cortana”; she will show a list of suggested capabilities to get you started.
  • Try the same thing on your iPhone: hold down the home button until it responds, and say “Help me Siri” to get a list of suggested actions (you can also configure it to respond to “Hey Siri”).
  • On an Android device, try saying “Ok Google”, then say “help”.

What you can see is that your devices can interact with you on your terms.  It is not perfect; sometimes we see the famous and usually funny (and sometimes embarrassing) auto-correct responses when we use our voice. But overall it is really working quite well.
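
If you are curious how developers tap into the same capability, here is a minimal sketch in Python. It assumes the SpeechRecognition package and a working microphone, and uses Google's free web recognizer, one of several engines the library wraps; it is an illustration, not the code behind Siri or Cortana.

    import speech_recognition as sr

    recognizer = sr.Recognizer()

    # Capture one utterance from the default microphone.
    with sr.Microphone() as source:
        print("Say something...")
        audio = recognizer.listen(source)

    # Send the audio to the recognizer and print the transcription.
    try:
        print("You said:", recognizer.recognize_google(audio))
    except sr.UnknownValueError:
        print("Sorry, I could not understand that.")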

Summary of Technology Frontiers

There are waves of technology shifts that represent new frontiers for users and business organizations, and each represents some questions: What is this?  How can it help me?  What are the risks? We are looking at these so you know we have an eye on what may make a difference for you!

This week, some of us from Keystone will be at the Consumer Electronics Show (CES) in Las Vegas, which is the largest expo of technology directed at consumers and organizations that serve them.  We are excited to continue to dig in and see what is coming down the road that will affect all of our lives!

Technology Frontiers

The word frontier can be defined as “areas near or beyond a boundary”, and when we think of “frontiersmen”, we may think of early explorers, or the crew of the Enterprise on Star Trek exploring “space, the final frontier.”  These frontiers are new and exciting, but also fraught with risks and unknowns. We have gone through many frontiers in the information technology industry.  Looking back in my life we have had several: the move from mainframes to PCs, from character-based operating systems to Windows and Mac graphical user interfaces, from local area network client/server applications to web-based applications, and from PCs to tablets and other mobile devices.

There are always new technologies, and they drive change in how we operate and live and communicate.  Think of how the Pony Express and telegraph allowed people to settle out west in the 1800s, far away from the civilization they knew in the eastern cities.  Similarly, today I write this at home while connected to my office and team via web, email, cell phone, and Skype for Business for chatting and sharing documents and screens. I am not 2,000 miles away, but I could be and it would be fine for what I need to do now.  This was not possible 20 years ago and yet it has become commonplace, and in it we see that I am using several of the technology frontiers of my lifetime.

And even now, we at Keystone are working with new technology that you may not even be aware of.  Why do we do this?  Inherently we love technology, so even if you did not need us to do it, we would still geek out at the latest mobile phone, backup software, security patch, and other technologies that are fun to us but mundane to many. We just can’t help ourselves, but we know that not everybody can stomach the pain of the new frontier.  They call it “cutting edge” for a reason, and sometimes it means “bleeding edge”.  We would not subject our clients to a new technology unless we have a good sense of the inherent risks, and how to overcome them to get the best value in the safest way possible.

Here are some technology frontiers we are exploring now in our Research and Development (R&D) that you may see as commonplace in your future.

3D Printing

So far, this feels like early paper printing technology.  Have you ever sent a job to a printer and nothing happens?  Or have you ever sent a 2 page document only to get 100 pages of what looks like alien communication?  That is what 3D printing feels like now.

3D printing takes the concept of a data file, with instructions for how an object is shaped, and combines it with plastic extrusion technology to “print” the object.  You load the plastic filament into the 3D printer, and send the job to it as a set of instructions.  The printer prints the object by feeding the filament through a heated extrusion nozzle (the “print head”) and depositing it onto a flat surface.  The print head moves up and down, the flat surface (“the bed”) moves back and forth, and eventually your object is sitting there, ready to use.
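
For a sense of what that “set of instructions” looks like, here is a toy Python sketch that emits G-code, the instruction language most consumer 3D printers speak. The temperature and extrusion values are illustrative assumptions; a real slicer program computes them from filament diameter and layer height.

    # Emit G-code for one square outline at a given layer height.
    def square_outline(size=20.0, z=0.2, feed=1200):
        gcode = [
            "G28            ; home all axes",
            "M104 S200      ; heat the nozzle (illustrative temperature)",
            f"G1 Z{z} F{feed}  ; move to the first layer height",
        ]
        corners = [(size, 0), (size, size), (0, size), (0, 0)]
        extruded = 0.0
        for x, y in corners:
            extruded += 0.8  # pretend filament amount per segment
            gcode.append(f"G1 X{x} Y{y} E{extruded:.2f} F{feed}")
        return "\n".join(gcode)

    print(square_outline())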

But it does not quite work that well.  Sometimes it runs for a while and stops, sometimes it slams into the bed and melts a hole, sometimes nothing, sometimes a big mess of plastic, etc.  But when it works it is great!

Think about some of the implications for your life.

  • You want to create a 3D representation of a new factory floor plan to test your kaizen or lean model more fully – just design it in the 3D software by dragging and sizing objects, and send it to the printer. This reduces the time to prototype, saving costs and improving flexibility.
  • Can’t find the battery cover to the remote control – just go online and download the design file and print a new one. No trip to the store, no tape over the batteries, etc.
  • Your client is not able to visualize what you are describing for your latest design for their building, and you are not going to make the sale because they lack a perspective needed to decide. Perhaps a 3D representation will help?
  • You need a new towel hook for the bathroom, but instead of buying online and waiting, you go browse designs, select and pay for one, and download and print.

At this point 3D printing has already been used to prototype new cars, create functional desk accessories, help sell ideas, and even produce experimental replacements for failing body parts.

It is new, it is exciting, and we are testing it now!

Clustered Computing

Most computing is one computer doing one or more jobs, and reliability and performance are based on what is in the machine’s box.  If you need more power, you open the machine and add more memory or disk space.  If the CPU is a few years old and not keeping up, you buy a new machine and rebuild everything. If you need reliability you buy one with at least 2 of everything you can: multiple drives, power supplies, and network cards. Performance and redundancy in this model are built on what is in the machine.

But what if you could just add another machine and have it do half the work?  Or three more machines, so they all share it?  You now have 4 machines – 4x the performance, and if one goes down you run on 3 machines and replace the failed one as needed.  This is “clustered computing”.

It is not particularly new, and is sometimes called “Super Computing”, “Parallel Processing”, or “High Performance Computing”.  It was first conceived in the 1960s, but it required incredibly expensive hardware and custom software, and was only accessible to organizations like the National Oceanic and Atmospheric Administration (NOAA) for use in weather studies.  In the mid-1990s new technologies allowed computer clusters to be built from commodity servers (search for “Beowulf Cluster”).  Suddenly organizations could build their own.  So at the same time that the internet was becoming available to everybody, the power of clustered computers became available to build search engines like Google and Yahoo!  (for a quick view of Google’s first cluster, which looks like a Lego system, see this: http://infolab.stanford.edu/pub/voy/museum/pictures/display/0-4-Google.htm).

These capabilities are now becoming available in two ways:

  • Build your own local super computer from off-the-shelf parts. We are doing this now, using about $200 in parts primarily based on Raspberry Pi boards. By linking 4 of these credit-card-sized boards, each with 4 “cores”, together in a clustered network and using special software, we have what looks like one computer to a software application. In testing, it takes about 35 seconds to calculate the value of Pi to 16 digits on one core of one Raspberry Pi, but when we go to 4 Raspberry Pi units (16 cores) we are seeing times of less than 9 seconds!
  • Rent space on a cloud provider’s platform and use it while letting somebody else (Microsoft, Amazon, Google, etc.) do the dirty work of managing the platform and the networking. See this for a quick description of Google’s current platform for this (https://cloud.google.com/solutions/architecture/highperformancecomputing).

One caveat is that your software must be developed to run in a multi-node, multi-core environment.  You can’t just grab a copy of Microsoft Excel and expect it to calculate your budget faster (although oddly enough, Microsoft has extensions to support this: https://technet.microsoft.com/en-us/library/ff877825(v=ws.10).aspx).  Your software has to be designed for multi-threading and multi-core support (you may see terms like “HPC” or “High Performance Computing”).  The leaders in this area now are big data database packages like Hadoop that have to process incredibly high volumes of data in a short time.  A tiny illustration of this kind of parallelism appears below.
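
As a small illustration of software written for multiple cores, here is a sketch in Python that estimates Pi with a Monte Carlo method spread across four worker processes. This is not the software we run on the Raspberry Pi cluster, just the general idea on a single multi-core machine.

    import random
    from multiprocessing import Pool

    def count_hits(samples):
        # Count random points that land inside the unit quarter-circle.
        hits = 0
        for _ in range(samples):
            x, y = random.random(), random.random()
            if x * x + y * y <= 1.0:
                hits += 1
        return hits

    if __name__ == "__main__":
        total, workers = 4_000_000, 4
        with Pool(workers) as pool:
            # Each worker handles a quarter of the samples in parallel.
            hits = sum(pool.map(count_hits, [total // workers] * workers))
        print("Pi is roughly", 4 * hits / total)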

This technology may not be ready for the average small to medium sized business, but it shows what is possible and could help with growth and seasonal needs.

Latest Applications, Operating Systems, and Devices

This is the most basic technology we test – the thing you see next week, or next month, or next year.  We have multiple devices and many different operating systems and applications that are in beta form, and we are trying them out so we have a perspective on what you may see, when you should move to it, and what the risk and reward will be.

In fact, I just had a lock-up while writing this article using an advanced preview copy of Windows 10 and the latest version of Microsoft Word 2016; it does not occur often, but it does happen.  We are evaluating the features and capabilities, the user interface, and the reliability (in this case I lost a few minutes but no data).

Some of the tools we are testing now include:

  • The latest suite of Office 365 Products, including Skype for Business
  • SharePoint and OneDrive for Business
  • Apple MacBook 12” with a beta version of Apple OS X
  • Cloud Based Information Security Systems
  • Amazon Echo
  • Beta versions of iOS (on iPhones and iPads)

Summary of Technology Frontiers

There are waves of technology shifts that represent new frontiers for users and business organizations, and each represents some questions: What is this?  How can it help me?  What are the risks? We are looking at these so you know we have an eye on what may make a difference for you!

Next time we will catch up some more, and include some other technology frontiers like Internet of Things (IoT) and Voice Recognition!

Information Security: Essential To Your Organization

Information security is essential to your organization.  You store data in spreadsheets, Microsoft Word documents, an ERP or CRM database, and human resources and payroll systems.  This data represents your business and the way it operates, its performance, its future plans, and its personnel.  These items are unique to your business and create risk if lost or compromised; this data is therefore essential to your organization and must be protected.

You might think of it similarly to how you put locks on your doors, but this is oversimplified and does not adequately address the problem. Unlike the doors of the building, which remain static over a long time period, information is fluid, changes daily, and faces new threats on a regular basis. Consider the fact that antivirus software vendors release updates daily to detect new viruses, or that your employees are changing over time and their access and policies are shifting. And these are just the internal problems. Consider also that new information theft tools, and those we refer to as “hackers”, are rising up every day to attempt to get into your systems without ever entering your geographical environment; in fact it is likely they are doing this from another country.

This all presents an environment which is much more difficult to manage than simply considering it like “locking the doors”.

Information Security Venn Diagram

In this article we want to share with you the three factors that you must consider as you attempt to secure your environment. These are the policies you implement, the tools used to protect your environment, and the people who affect your information security.

The Policies of Information Security

When we speak of policies we are considering this in two ways: the written philosophy and policies of information security, and the enforceable policies due to system settings or tools that are implemented.

The first is your written policies and philosophy about information security. Some organizations consider this an afterthought and do not direct their information technology team or their employees to consider security important or behave in a certain way to protect it.  Does your employee manual have specific language about what information is allowed to be accessed? Do you have a published directive on what the Internet can be used for while inside your network? These types of questions and many others must be answered and a specific philosophy developed through written policy to inform your users of what they should and should not do. This also elevates the importance of information security because it is being published. Finally, it serves as a way to enforce your policies through potential employment disciplinary actions if they are not followed.

The second type of policy is implemented in the tools that are intended to protect your environment. For example, your organization likely has a “firewall”, and this device manages the traffic in and out of your organization, permitting some and preventing the rest. These are known as firewall rules and are intended to limit the exposure your environment presents to the outside world. A specific example of one of these rules would be to not allow access to third-party public email servers from inside the network.  This would be intended to prevent viruses from coming in through Hotmail, Gmail, etc.  There may be a written policy that says employees should not access these systems from their company-supplied computer, but the firewall helps you enforce the rule by preventing access.  Other examples would include a password length or complexity requirement (including a time requirement to change passwords), file directory security on a shared folder, and many more that your information technology team would implement (a small sketch of one such rule appears below). These would likely be unique to your business and should be considered carefully and then monitored for changes as time goes on.
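
As one small, concrete example of an enforceable policy, here is a Python sketch of a password complexity check. The thresholds are illustrative assumptions, not a recommendation; in practice your IT team would set this in Active Directory or an equivalent system rather than write code.

    import re

    def meets_policy(password):
        # Require 12+ characters with upper, lower, digit, and symbol.
        return (
            len(password) >= 12
            and re.search(r"[A-Z]", password) is not None
            and re.search(r"[a-z]", password) is not None
            and re.search(r"\d", password) is not None
            and re.search(r"[^\w\s]", password) is not None
        )

    print(meets_policy("Correct-Horse-42"))  # True
    print(meets_policy("password"))          # False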

You can see where these two types of policies are an important consideration in your organization.  If you are not talking about these at least on some level annually it is highly likely that you have an insecure environment.  It is important to consider them both to provide information security, as well as provide access to your authorized users to what they need to efficiently perform their duties.

The Tools of Information Security

The tools of information security are the technology components which are intended to protect your environment. These include the ones you probably know about like antivirus software or your firewall, but they also include things like active directory security, monitoring and logging tools on system access, server patching for vendor supplied security updates, and many more.

Keystone has found that no single information security tool can provide peace of mind by itself. If you have a great firewall, it also has to be effectively managed, and users who bring a USB stick from home have to be prevented from using it on a work PC. In other words, the firewall by itself does not protect your environment.  In fact, we tested all of the popular antivirus tools and found that none of them could trap more than approximately half of the attacks they encountered. We determined that we had to use multiple tools in the environment, carefully aligned so that they would not prevent users from working effectively, while at the same time reducing the attack surface.  Some of the tools you should consider for continuous review include:

  • Firewalls
  • Patch process and level of all components, particularly servers, firewalls, and routers
  • Physical security, particularly for servers
  • Antivirus and antimalware software
  • Active Directory configuration (this controls who can access data)
  • Personal computer updates
  • Administrative (or “root”) access on any system
  • Security and access logs

Once again, all of these should be reviewed on a regular basis, because new threats arise and new tools are available to counter them. As one small example, the sketch below shows the kind of automated log review that can be part of that cycle.
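
Here is a minimal Python sketch that counts failed SSH logins per source address in a Linux authentication log. The path and message format assume a typical Debian/Ubuntu sshd setup; a commercial monitoring tool does the same kind of thing at much larger scale.

    import re
    from collections import Counter

    failed = re.compile(r"Failed password for .* from (\d+\.\d+\.\d+\.\d+)")
    counts = Counter()

    # Tally failed login attempts by source IP address.
    with open("/var/log/auth.log") as log:
        for line in log:
            match = failed.search(line)
            if match:
                counts[match.group(1)] += 1

    # Print the ten noisiest sources, a quick signal of brute-force attempts.
    for ip, hits in counts.most_common(10):
        print(ip, hits, "failed logins")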

The People of Information Security

The people of information security include your users, and your information technology personnel.

Information Security and Users

A good IT team can go a long way towards protecting you, but unfortunately users can undo a lot of their efforts.

We recently had a third party perform a security audit of one of our client environments. This environment includes over 50 servers and 300 workstations.  The audit was limited to perimeter testing, which means that the firm performing it was not inside the building or on the internal network, and instead posed as a hacker from outside. We received the results and were pleased that out of the entire environment there were only seven issues, two of which were previously known.  What surprised us is that some of the issues referred to internal server names, as opposed to how they might be accessed from the outside. Further investigation revealed that the servers which were accessible to the security audit firm were behind the firewall and in no way available from outside. We asked the security audit firm to confirm that they had done a perimeter security check only, and they revealed that they had made phone calls to users, posing as somebody authorized to receive their account names and passwords, and users gave them out over the phone. This gave them access to the inside of the network, where they ran scanning tools to find vulnerabilities.

In other words, the servers of concern were inside the network and were “protected” except for the fact that users had given out their credentials to somebody who requested them over the phone.

It is essential to train users and keep them continually updated on the policies you have written and threats they may encounter, and also the importance of information security for every individual. We recommend at least annual training, and regular bulletins to users about common security threats.

Information Security and your IT Team

Your information technology team will need to have two essential attributes:

  • IT Security Skills
  • IT Security Mindset

The first, IT security skills, is the knowledge and capability to secure all of the components in the environment. This will not be possible with one or two IT people, as they will not understand the deep components of the technology involved well enough to lock down everything adequately. They may understand firewalls, but not know how to secure the email server that is available through the firewall. They may understand how to update antivirus on a desktop, but not know how to implement a server policy to audit for unauthorized access by internal employees. You should have an inventory of all of your systems and identify the people who are capable of securing them. Once again, this is an ongoing process, as the systems, and the threats to them, change on an almost daily basis.

The second is an IT security mindset. This means that the personnel who manage your systems and support your users have a top-of-mind awareness that information security is important. They make decisions on a regular basis as they design, implement, and allow access to systems. For example, if a user is having a problem editing a document, some IT employees with a low regard for security will allow wide-open access so that access is no longer a potential inhibitor for the employee. A person with an IT security mindset will recognize the problem, but would never remove all access rules. You may have a person on your team like this, but it is essential that everybody who manages systems and access thinks this way. Your IT team also needs regular training and outside assistance.

Conclusion

Information security is essential to your environment, but it is difficult to attain and maintain.  The constant new threats, varied and changing systems and personnel, and ongoing business needs make this difficult.

Keystone can help you with this. We have a security mindset because we understand you look to us to help you attain a good business flow of data and keep it secure. We have a large team of individuals with skills in all of the common systems in your environment, like email, firewalls, network file shares, etc. Once again all of these have to be considered, or else it is similar to locking nine of your doors and leaving the tenth unlocked.  We have a large enough staff to consider all of them.

We work with numerous companies and therefore get an opportunity to see all of the types of threats that you may encounter: the threat we see today and respond to is the one you may encounter next week. We often already have a plan to stop it before you ever see it.  We help you develop written policies, provide user training, and create an environment where information security is interwoven in a way that assists your business rather than preventing it from performing basic functions.

We constantly review and refine our toolset, and usually include the entire package of tools in our services so you have a comprehensive mindset, team of people, and toolset to protect you.

Contact us today to see how we can review your environment and help you implement an information security solution that adds value.

 

Learning from the New York Stock Exchange’s Technology Failure

The New York Stock Exchange (NYSE) experienced a serious technology failure this week, halting trading for approximately 3.5 hours after experiencing reduced functionality for the first 2.5 hours of the trading day.  The NYSE is of course a very high profile, internationally critical component of our financial systems.  System-wide failures are extremely rare, and when they do occur they are publicized.  This allows us to consider what happened, and what we can learn from it that may help you.

What was the Technology Failure?

The NYSE has numerous software applications that are integrated to provide a cohesive system for access and control.  There is the core record keeping system, systems to manage the process, customer systems to control accounts and execute trades, systems that monitor activity for fraud, etc. These systems exchange data with each other at various levels, and are dependent on being compatible and reliable.

On Tuesday evening, July 7, 2015, NYSE administrators applied an update to one of these systems to support a change in how the industry timestamps transactions. On Wednesday morning, July 8, 2015, the NYSE started noticing issues with communications between systems and applied an update to the customer system; this in turn created more issues.

The problem was not resolved, and at 11:30am the NYSE shut down trading and continued to work on the issue. At just after 3:00pm, non-updated backup systems were brought up in place of the production systems and operations resumed.

A quick synopsis can be seen here: http://www.cio.com/article/2946354/software-update-caused-nyse-suspension.html

What do we learn that is applicable?

What does your SMB-sized organization take away from this?

We may be able to continue operations. The NYSE must have a level playing field so everybody can execute trades at the same time, or else fraud or inequality of opportunity becomes an issue.  Your business may be able to continue operations without a complete shutdown when one function is limited or down.  For example, if your customer service system is down and orders via the web cannot be taken, it may be possible to post a message informing customers they can call customer service to place an order.  You may need to temporarily reallocate staff to handle more call volume, but customers can still be serviced and a more intimate conversation can take place during the transaction.

Systems are complex, especially multiple systems that communicate with each other.  Software, especially software designed for a specific organization and use, can be complex, and the luxury of waiting for others to test it in the real world is not present.  So testing is essential, and it must reflect the real world: real data, real transactions, and real systems that mirror production with the tested changes applied.  The testing must be broad, rigorous, and deliberate, and results must be tracked.  Automated test tools can make the process more efficient, but they are just pieces of software and must be set up and used correctly. When multiple systems are involved and dependent upon each other, they all must be exercised.
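
As a toy illustration of the kind of deliberate, trackable test this implies, here is a pytest-style sketch in Python. The timestamp-formatting function is hypothetical, standing in for the sort of change the NYSE update made; the point is that the expected behavior is pinned down before the update ships.

    from datetime import datetime, timezone

    def format_trade_timestamp(ts):
        # Hypothetical function under test: render a trade time in UTC
        # with millisecond precision.
        return ts.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.%f")[:-3] + "Z"

    def test_timestamp_is_utc_with_milliseconds():
        ts = datetime(2015, 7, 7, 21, 30, 5, 123456, tzinfo=timezone.utc)
        assert format_trade_timestamp(ts) == "2015-07-07T21:30:05.123Z"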

Disaster Recovery works, but executing it is a choice.  In this case, the NYSE decided to cut over to backup systems to continue operations.  This is not the same thing as pulling a server out of the closet, installing everything, and going back to operations.  This is a “hot system”, one that has all of the live data but was not updated with the errant code.  Cutting over is not a small decision, as there is normally a cut-back process once issues are resolved, but it was one they could make because they had designed the systems for it.  This allowed them to resume operations while still dealing with the issue. Most small organizations do not have this capability, but they can have it, and at a very economical price.  My firm, Keystone Technology Consultants, offers this for clients as small as 20 users. It is not just a peace-of-mind issue; it literally allows an organization to continue operations, keep the flow of work and money, and maintain its reputation and client relationships. It is essential.

I would love to hear your view of this, feel free to comment below.

 

 

Office 365 for SMB – Share without Tears


Microsoft offers a hosted service for organizations seeking advanced email, file hosting, team sites, collaboration, and a lot more.  Small to medium businesses need this, but the cost to build and host it yourself is prohibitive. Microsoft has built world-class tools on world-class platforms and offers them at a monthly fee that is palatable and flexible.

Keystone Technology Consultants has become a premier partner for this service, and uses it across our client base.  We highly recommend it for the right situations.

You may be interested in knowing more, and one way to get that is the following slide deck.  And when you are ready, call Keystone to see how this would help your organization.

The Future Outsourced IT Model?

 

Outsourcing is sometimes viewed as a dirty word in organizations, and the Information Technology (IT) department is right there with that sentiment.  C-level executives looking to improve the organization’s bottom line by outsourcing non-core areas create anxiety among employees who become fearful of losing their jobs.

Traditional Outsourcing

The general thought behind outsourcing is to look for the areas in an organization that are not critical to the products or services that drive the viability of the organization.  Any function that is not directly related to why a customer buys from you is reviewed for potential outsourcing.  This is based on the assumption that an organization that focuses on a function will do it better than one that does not.  I work in an IT consulting firm; we provide services by having talented individuals work as a team for clients. We cannot outsource that function, because those individuals are how we show a profit, and the reason people view us favorably (based on past successful work using the same team, tools, and methodology).

But we don’t clean our office.

Not that it is dirty, but that function is not core to the services we provide.  We contract with somebody to come in daily and clean.   They do it efficiently and to our satisfaction while showing a reasonable profit.  We cannot do that by using our higher paid employee base; it would make no sense.  We outsource that non-critical function to somebody who views it as a critical function.

We may not have as many employees because of this, but the ones we have share our focus on IT services.  If we used to have our own cleaning staff, the desire to outsource would have caused significant stress for them, and been a paradigm change for us.

Therein lies the reason outsourcing is sometimes a dirty word.

Some organizations view IT the same way we view cleaning: as a non-core function that can be outsourced.  Large companies such as IBM, Perot Systems, and Hewlett-Packard make significant revenue from providing outsourced services.  We make a reasonable profit with certain clients from providing outsourced services.  If your organization manufactures widgets, then maybe the IT function is not core, right along with cleaning, payroll, tax accounting, and even sales, among others.  None of these need to be provided by somebody working exclusively at your direction, and for your singular purpose.

Everything I have alluded to implies outsourcing applies to people – reduce the headcount managed in a function by finding somebody else to perform the service, and even manage it.  This can result in the employee leaving work on Friday working for company “A”, and returning on Monday to the same cubicle but employed by company “B”, possibly with reduced benefits, career opportunities, etc.  Company “A” saves capital, and can focus on core functions. This has been the outsourcing method used popularly for the last 20 years or more.

New Technologies Change the Terms

But new technologies change the terms of outsourcing: instead of focusing on the headcount of people, focus on the specific function.   These technologies are known as Service Oriented Architecture (SOA), and are made possible by using the internet as a transport layer between systems that are made available to provide services (known as “Web Services”) to each other.  While these technologies cannot help us clean our office (that is after all a physical function) they can help us in many other areas, and any business process provided by application or even system software can be reviewed for outsourcing.  This capability was not available before technologies like the web and web services.

Service Oriented Architecture (SOA) is a model that ties disparate systems together using communication interfaces that allow them to talk to each other in industry-standard terms.  You might consider it an evolution of long-established Electronic Data Interchange (EDI).  But EDI, while still providing value in many functions, has two inherent problems.  First, it is focused on the format of the messages, not the means of actually transporting them.  Because of this, so-called Value Added Networks grew to provide connection services and clearing houses between two systems that could not natively talk to each other. This increases costs and reduces flexibility.  Second, it only covers specific transactions that a standards board has set.  So you can transport a purchase order and an invoice, but not a flight reservation, a request for properly formatted +4 zip codes, or a product specification; things that various organizations would be very interested in.  So EDI has a standard, but it is limited to specific functions and specific data elements within those functions.  This in turn limits an organization from responding to its own needs.

What SOA provides is the ability of two organizations to agree to outsource a function, but not necessarily a person, as has been the approach to outsourcing. We are not implying that no employment will be lost, but are viewing future outsourcing as being service or function based, as opposed to being team and job based. The employee headcount may be reduced or held steady as a side effect, but the focus is finding some web service that can process information and return it to us better than we can ourselves.

Business Process Engineering and SOA

Businesses have spent a lot of time thinking about the processes they use, and they have defined and redefined them using Business Process Reengineering, Six Sigma, Total Quality Management, Continuous Improvement, etc., ad nauseam.  Many improvements were made, and that unique business process, designed for the organization that employs it, has been encapsulated within the organization that uses it.  It is hidden in business rules and databases, web pages and email clients, ready to be presented to users at the appropriate time to make sure the business process is followed. This gives rise to ERP systems and large Line of Business (LOB) applications that perform functions across an enterprise, following a predefined business process to ensure uniformity, predictability, and cohesion among business functions.

Take for example the simple Zip Code + 4 function: an organization wants to ensure that every mailing is successfully delivered to the right location, and so as part of the process it runs the address against a computer application that returns the correct nine-digit zip code.  In this way, costs are reduced, and the correct targets receive the content in the mailing.  But sometimes the rules or even the underlying data change in this specific function.  New addresses are added, street names change, zip codes are redefined, etc.  This may necessitate a change deep within the LOB or ERP system, and your organization, with its very specific business process built into the system, is the only one capable of making the change, using internal employees familiar with the code and structure.  You bear the full cost of the change, and can only amortize it across your organization.  But the Zip Code + 4 function is NOT core to the business. It needs to be done, but it is not how you actually generate revenue.  The content of the mailing should do that; the commonly used Zip Code + 4 does not.

The architecture organizations have traditionally used over this re-engineering period is monolithic.  They built single systems, or tightly integrated systems, that they owned or customized to the point of not even looking like the application they procured.  These were often hidden behind a firewall, focused on the broad and internal needs of the organization’s process.  What they did not invite was the ability to simply integrate one little external function into the business process. So the Zip Code + 4 function remains locked in the LOB or ERP system.   This limits the flexibility to reuse and share the function, or to expand it to new acquisitions; in short, to generally focus on the core process.

But what if you found another organization that focuses only on this functional need, and which would allow you to connect your business process directly to theirs? You send them the address, and they send you back the properly formatted address with the correct Zip Code + 4.  That is what SOA and web services can provide.  You can pick from various vendors who will provide the service, selecting the one that best meets your quality and budgetary needs.
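
In code, consuming such a service can be as small as the following Python sketch. The endpoint and payload shape are hypothetical, invented for illustration; each real vendor (or the USPS itself) defines its own interface.

    import requests

    # Send an address to a hypothetical standardization web service.
    response = requests.post(
        "https://api.example.com/v1/standardize-address",
        json={
            "street": "1600 Pennsylvania Ave NW",
            "city": "Washington",
            "state": "DC",
        },
        timeout=10,
    )
    response.raise_for_status()

    # In this sketch the service returns the corrected, Zip+4 address.
    print(response.json())

Notice that nothing in your own system needs to know how the lookup is done; the function has been outsourced, not a department.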

Summary

In the end this provides an outsourcing model that looks at specific functions within a business process, instead of specific departments or personnel.  You may reduce headcount, and in fact likely will, but that was not the focus. The focus was on improving quality and flexibility, and at the same time turning your attention to the core functions and the systems that encapsulate and enforce them.

This may be the future of outsourcing: reviewing the business process, function by function, for opportunities to do so.  I invite your thoughts on the subject.

 

Document Retention and Protection

Organizations generate a lot of data, and usually want to protect it. They store it on reliable media, secure it from unauthorized use, and back it up. This information, especially historical data such as sales history, contracts, emails, etc., is essential for making decisions pertinent to the future of the organization. Many organizations use a Document Management System (DMS) to store these documents. These systems ensure documents are versioned over time so that it is easy to find them and see the various changes that were applied, while at the same time enforcing retention rules that dictate what must and must not be kept. This forms a repository that ensures data is both secure and easy to locate.

That made it interesting to learn that Google, king of search engines, was tripped up by a search engine run against documents in its own possession. (Read full article here)

Google has been sued by Oracle for use of a programming technology called Java in its Android phone and tablet operating system. Most lawsuits have a discovery period, in which each litigant turns over all documents pertinent to the case, but can exclude any attorney-client privileged documents. For example, original blueprints of an alleged copycat product must be turned over, but a letter between an attorney for the litigant and the engineer which discusses the blueprints is protected and can be held back from disclosure.

In order to facilitate discovery assembly, legal analysts use tools similar to the Google search engine, which allows them to collect information from across the organization’s systems using keywords like “Java” while at the same time excluding documents with phrases like “attorney work product”. In this case Google was caught by a draft email stored on the server; one of 9 drafts before the final. This particular version stated that Google should license the Java technology from Oracle. But because it was an early draft, it did not include the key phrase “attorney work product”, and was released to Oracle’s attorneys. It may change the case completely.

This illustrates the key point: you need to know what you keep, what you should not keep, and how it is coded. Document Management Systems can help with this: by setting up and enforcing retention or coding rules, they can automatically add a tag or delete documents over a certain age.
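
For a feel of what an automated retention rule does, here is a minimal Python sketch that expires files older than seven years from a hypothetical archive folder. A real DMS applies rules per document type and keeps an audit trail; this only shows the idea.

    import time
    from pathlib import Path

    RETENTION_DAYS = 7 * 365
    cutoff = time.time() - RETENTION_DAYS * 24 * 60 * 60

    # Walk the archive and delete anything past the retention window.
    for path in Path("archive").rglob("*"):
        if path.is_file() and path.stat().st_mtime < cutoff:
            print("expiring", path)
            path.unlink()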

Google shows us a great lesson; what do you learn from it? If you would like an opportunity to discuss how you can put the right systems and policies in place, feel free to leave your feedback below.

3D Printing and Piracy

Information security is always important, and when it is not designed and enforced properly it creates holes that allow unauthorized access to your critical data. Any exposure of salaries, customer credit cards, financials, or product designs may be potentially fatal to your organization. The more your business depends on processing information, the more damaging a loss of data may be. If you are a bank and suffer theft of information, it may be the very reason you go out of business. But if you are a manufacturing firm, loss of data has not traditionally meant loss of product control. A bank loses control with loss of data; a manufacturer does not.

But the advent of 3D printers that can “print” a product from a design file may allow somebody to quickly generate the same product you manufacture without you even knowing it. And now one of the most notorious file sharing sites, The Pirate Bay, has an area for “3D Designs”. This means a product design of yours in the wild could be copied as freely as MP3 music files are now. While there is not a large market for 3D printing yet, it is a growing manufacturing area that, as it attracts attention, will also attract pirates who can obtain your design source files and easily run them locally to generate a part identical to the one you produce. They can start where you left off, “improving” the product and creating their own offering based on your initial labor.

A video of how these printers work

What this means for manufacturers is that they need to show the same concern about piracy and security as the recording industry does. It means ensuring that operating systems and their patches are kept updated, applying security rules to limit access, tightening physical and logical access controls on systems and data, and other measures. As the recording industry has shown, once the cat is out of the bag it is almost impossible to put it back in, so looking at your systems now and getting ready should at least be in your plans.

If you would like an opportunity to discuss how you can put the right systems and policies in place, feel free to leave your feedback below.