Choosing the Right Backup Solution

This week’s guest blogger is a member of IT Central Station’s Elite Squad – Chris Childerhose. Chris is a Technical Specialist for Storage, Virtualization & Backup. He’s published reviews of EMC Data Domain, Veeam, Nimble Storage, VMTurbo and other solutions. 

Every virtualization and system administrator deals with having to recover servers, files, and more, and having a backup solution to help with recovery eases that burden. But how do you know which one is right for you? How would you go about choosing the right solution to help you in your daily tasks?

Software Criteria

When choosing a backup solution there are many things to consider based on your physical/virtual environment. What hypervisor are you running, what storage is being used, etc.? The best way to choose the right solution for the job is through evaluation and the more you evaluate the easier it will be to pick the right one for you. During an evaluation process you should consider things such as:

  • Compatibility with your chosen Hypervisor
  • Ease of installation and setup
  • Program ease of use and navigation
  • Backup scheduling
  • Reporting – is the reporting sufficient?
  • Popularity within the industry
  • Support for Physical and Virtual servers
  • And so on…and so on….

There are many criteria you can use in the evaluation stage, and the above examples are just a few. Composing a list before you start looking at software is the recommended approach; that way, you are only looking at software that fits most of your criteria prior to the evaluation/PoC stage.
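One simple way to turn such a criteria list into a side-by-side comparison is a weighted scorecard. The sketch below is purely illustrative – the criteria, weights, product names, and scores are all hypothetical examples, not recommendations:

```python
# Hypothetical weighted scorecard for comparing backup products.
# Criteria and weights reflect how important each item is to you.
criteria = {                     # criterion -> weight
    "hypervisor compatibility": 3,
    "ease of installation": 2,
    "ease of use": 2,
    "backup scheduling": 2,
    "reporting": 1,
    "physical + virtual support": 3,
}

# Scores (0-5) you assign to each candidate during your evaluation/PoC
candidates = {
    "Product A": {"hypervisor compatibility": 5, "ease of installation": 4,
                  "ease of use": 4, "backup scheduling": 5,
                  "reporting": 3, "physical + virtual support": 5},
    "Product B": {"hypervisor compatibility": 4, "ease of installation": 5,
                  "ease of use": 3, "backup scheduling": 4,
                  "reporting": 4, "physical + virtual support": 2},
}

def weighted_score(scores):
    """Sum of (weight x score) across all criteria."""
    return sum(criteria[c] * scores[c] for c in criteria)

# Rank candidates from best to worst total score
for name, scores in sorted(candidates.items(),
                           key=lambda kv: weighted_score(kv[1]),
                           reverse=True):
    print(f"{name}: {weighted_score(scores)}")
```

The point is not the arithmetic but the discipline: writing the weights down before the PoC keeps the comparison honest when a flashy demo tempts you to forget your own priorities.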


When you have completed your criteria list and selected vendors for evaluation, be sure to install all of them. Installing all of the products allows you to do a side-by-side comparison of the features you are looking for, like job setup, ease of use, etc. Seeing the products and how they work side-by-side gives you the best evaluation experience.

During the comparison stage, look at things like the ability to conduct SAN-based backup versus LAN-based – how does each solution compare? Can the solution connect into your SAN fabric, allowing faster backups? If you cannot use SAN backups, how will that affect the overall performance of the environment? After backups complete, is there a reporting structure showing success/failure, length of time, amount of data, etc.? When working with the solution, is navigation for job creation/modification simple? Is the product cumbersome and/or frustrating when creating backups?

There are many things to be aware of when comparing products, and answering questions like these as you go through the products is a great way to evaluate them.


Remember that there are many backup solutions out there for evaluation, and choosing the right one can be a difficult decision. Evaluating the ones that appeal most to your organization is the best way to go, and using a methodology for testing them is even better. In the end, you will ensure your success by choosing the right solution for the job! Evaluate… evaluate… evaluate.

The Weekly Roundup: Tableau

This week’s roundup explores what’s being said in our collection of Tableau reviews. Tableau is one of the fastest growing BI tools in the market. There are so many solutions on the market and it’s hard to know which is best for your organization. Today, let’s see what our real users are saying about Tableau.
Here are a few highlights direct from IT Central Station:

· User Bernhardsmith, a consultant at a financial services firm, says “it has some of the best conceived ‘template’ graphs I’ve seen in any package.”

· Doug Lautzenheiser, owner of a tech consulting firm, says that “Tableau is the type of visual analytics software that Microsoft itself should have added to Excel.”

· Bob Samuels, who works at a media company with 5,000+ employees, gives it a 5-star rating but says one of its limitations is its price. It is not cheap.

· Ted Cuzzillo, an industry analyst, says that in Tableau “workbooks can be passed around in a variety of ways forever”

· Marc A, a BI expert, gives it a 4-star rating and says that Tableau’s performance can be tricky on large datasets.

· Guillermo Cabiro, R&D Director at a tech consulting firm, says “It has a full pivot, drag & drop and drill down capability that’s great for power users.”

· And finally, Lazarmihai, a Project Manager at a software R&D company, gives Tableau a 3-star rating, saying it is very flexible and simple for a user to create reports with parameters and filters.

Click here to read more business intelligence reviews. If you haven’t already, sign up with IT Central Station, browse reviews, follow your favorite products, or write a review of your own!

Will Office 365 change the SharePoint vendors focus?

This week’s guest blogger is Marwan Tarek. Marwan is a Microsoft MVP and has been working with SharePoint since 2005. His areas of expertise include ECM and collaborative technologies. Contact us if you would like to be one of our guest bloggers.

Office 365 is a comprehensive platform that delivers main pillars like email, calendar, collaboration (including search, document management…etc.), unified communications and social.

Microsoft keeps adding to this platform like Project Online and PowerBI.

Office 365 is not an isolated platform, it works in tandem with Microsoft Azure to extend its services through Azure websites, Active Directory, and more to come.

The platform is fully managed by Microsoft and supported by a Microsoft SLA.

That said, how will this affect the existing SharePoint ecosystem, and Microsoft partners specifically?

I would categorise existing Microsoft partners into:

  1. Boutique services: they deliver software services in the shape of custom developed solutions on top of SharePoint and consultancy services.
  2. Products companies: they develop ready-made products that utilise or serve SharePoint as a platform – for example, workflow products, governance management, administration, backup and restore, custom web parts, custom HR solutions, ideas management, etc. You can check out a lot of these products on IT Central Station.
  3. Hosting and platform management companies: they provide managed services to clients who want to outsource the hosting of their own SharePoint platform.

Let’s see how each category will be affected:

Boutique services are the least affected of these categories. However, they should adapt and understand the change and the vision. Microsoft is pushing all custom development outside SharePoint, either in the form of apps hosted on Azure websites, or as custom applications (websites, Windows apps, mobile apps, etc.) that utilise SharePoint as a backend; the applications connect to SharePoint (or Office 365) using the new Office 365 APIs.

Products companies will need to reassess their strategy, review their market segments, and gauge how flexible their clients are to the new changes. Some clients are slower to change, or may reject the cloud concept altogether.

In my opinion, the companies focusing on platform management – upgrade and migration, backup and restore, administration – are hurt by the new move. In Office 365 there are no new versions that need upgrading and no new farms that require content migration. These companies need to repurpose their products and move up the technology stack (rather than focusing on the platform, move up to the application).

The companies building ready-made web parts or solutions on top of SharePoint will need to re-architect their solutions and keep a close relationship with Microsoft to stay on top of any upcoming platform changes.

Hosting companies are the most affected category. Simply put, they are going to lose all of the clients who move to the cloud. It is not only about SharePoint; most clients move the email and unified communications workloads first, and then SharePoint follows.

A cloud strategy is an important item on every CIO’s agenda, whether in the short term or the long term. That’s why all IT professional services firms need to re-innovate their offerings, focus on maximising the business value for their clients, and divert the focus from IT-only solutions.

Disclosure: My company is a Microsoft partner

Upcoming Event: Software Quality Conference

This week’s guest blog post is by Douglas F. Reynolds. Douglas is president of PNSQC (Pacific Northwest Software Quality Conference). PNSQC’s mission is to enable knowledge exchange between relevant parties within the software quality community, to produce higher quality software. PNSQC provides opportunities to demonstrate, teach and exchange ideas on both proven and leading edge software quality practices. This year’s annual PNSQC conference will take place October 20 – 22, 2014 in Portland, Oregon.

Let’s review the reasons attending PNSQC 2014 can help to build your Bridges to Quality.

PNSQC 2014 provides for rich interaction with leaders in the industry. Six experts explore new ideas in software quality. Our Keynote Speakers are Jon Bach on Live Site Quality & Richard Turner discussing Balancing Agility and Discipline in Systems Engineering. Check out all of our speakers at Keynotes, Invited Speakers and Workshops.

At PNSQC 2014 you will also learn from over 60 presenters as they reveal solutions, successes, and issues facing software quality. These are tales from the trenches – ideas you can take back to the office and put to immediate use. The complete program is now available with the Technical Abstracts.

At PNSQC 2014, networking is a priority throughout the day. Over lunch we offer Birds of a Feather sessions that give you an opportunity to hear from both the experts and your co-workers. The evening social events are held in collaboration with local software organizations, providing time to mingle with the locals and your fellow attendees.

PNSQC is a non-profit and the savings are passed back to you; register early and save. Join us for PNSQC 2014, register now!

How Does Your CEO Stack Up?

This week’s guest blog post is by Tom Puthiyamadam. Tom is an Advisory principal with PwC, and leads PwC’s Digital and Customer practices. His work is focused on co-developing and executing strategies to increase an organization’s growth and operational performance. Tom has significant experience in global transformation services for Fortune 500 companies, specializing in growth strategy, customer strategy, organization design, marketing, sales, and service effectiveness.

Companies that don’t become digitally fluent in the next five years will fall behind. That’s a lot of pressure to put on a CEO, but it seems those who are embracing digital in all parts of the enterprise are already seeing returns.

In our recent survey of 1,500 business and IT executives, we found that 63% of top-performing companies had CEOs who were addressing tough issues like how to effectively collect, interpret, and act on data-driven insights about their products and consumers; by contrast, only 44% of CEOs in the lower-performing groups were doing this.

Here’s the degree to which CEOs are embracing the champion role by sector:


CEOs looking to claim this mantle can do the following:
  • CEOs need a strategy fit for the digital age. This would encompass growth, productivity, customer experience, products and services, partnerships, risk and more. Yesterday’s obstacle is a new opportunity for digital.
  • CEOs must rethink the planning process. Asking the right questions at the right time is key to maximizing digital potential. Digital is not the question; where to start and how to accelerate your digital journey is.
  • CEOs must own it. From the C-suite down, every leader must have a clear-cut vision of how they will take digital and build it into their strategies. This cannot be left in the hands of a single executive who is meant to create change by influencing many. The CMO and the CIO are the best examples of executives leading the charge, but where is the rest of the C-suite?

The not-so-silent majority: How we built our online BPM community

This week’s guest blog post is by Vasileios Kospanos. Vasileios is a Senior Marketing Executive at Bizagi.  If you haven’t already, check out the real user 5-star reviews of Bizagi here on IT Central Station.

I was particularly interested to read a post earlier this year by IT Central Station’s CEO Russell Rothstein. His blog post, ‘Listening to the Silent Majority’, made the interesting point that the overwhelming majority (90%) of individuals within enterprise technology circles fail to effectively convey their feelings about the products they use.

For many of us, this may strike a chord: as end users of other enterprise products and services, engaging with a faceless corporation – however proficient our social media expertise – may seem like a lot of hard work… and for what?

So how can we engage this 90% and give them a voice?

As Russell has previously shown, it’s all about providing the tools, the support and ultimately, the end user value. At Bizagi, we rely on recommendations – take them away, and we have no business!

And like IT Central Station, make sure you give your customer something worth having: even better, go beyond their expectations. If you want people to rave about your products, then under-promise and over-deliver.

In our case, it’s our freemium model – not only can you get Bizagi software for free (2.5 million downloads and counting), there are none of the usual caveats: no time limits, no restriction on user numbers, no notable feature limitations. The ‘what’s in it for me’ is quite obvious, but we think there’s more for the Community than some snazzy BPM modeling software – however good we think it is.

Bizagi provides a platform where the Community can openly share their user experiences, exchange hints and tips, and even develop and swap BPM apps – all of which adds up to benefits on both sides, says Jolanta Pilecka, CMO at Bizagi:

“It’s a two-way thing: the better your product is, the more your customers will recommend you and the more you can spend making that product better. Over 25% of our revenue gets reinvested in R&D – in striking contrast to the single-digit industry average.”

Our final takeaway: once you’ve got your customer, don’t neglect them: make them excited to be part of your brand. Our #FlyMeBizagi competition, which allows community members to gain points in exchange for more software referrals, has already seen our customers engage in healthy competition. We can’t wait to tell one of our users that they’ll be flying to Brazil for this summer’s FIFA World Cup.

Last week, Bizagi was named in the Top Red Herring Europe 2014 Business award winners list. We were delighted to see our ‘incredibly loyal customers’ and ‘impressive user community’ mentioned as reasons for making the cut.

So for all customers reading this article: this one’s for you!

How to Find Trusted Sources of Information During the Buying Process

Today’s guest blog post is by Chris Newton who is VP of Business Development at Influitive, the advocate marketing experts. Before joining Influitive, he started the first product-centric customer advisory board at Siebel, and created one of the earliest advocate marketing programs using Influitive’s AdvocateHub platform at Xactly. In his life before software marketing, he was a pilot in the U.S. Navy and earned an MBA at Harvard.

As someone involved in technology purchases at your company, you may be skeptical about some of the information and collateral that vendors provide.

While some companies do a terrific job of truly educating their prospective buyers, others are guilty of producing self-serving, promotional content that leaves a bad taste in the mouths of tech buyers like you.

A recent study by the CMO Council backs this up. Its research found that “BtoB buyers and influencers are turned off by self-serving, irrelevant, over-hyped and overly technical content.”

This even extends to customer references supplied by vendors. While seemingly objective, there’s no way around the fact that companies cherry-pick the customers and experiences they want to highlight. In other words, they’re carefully presenting a positive – and unbalanced – picture.

To bypass this, B2B buyers are “migrating to peer-based communities and new sources of trusted, relevant and credible content and conversation,” according to the CMO Council report. Recent research from Forrester also demonstrates that customer advocates and category influencers are the most trusted sources of information for buyers today.


Lucky for you, these days you can discover plenty of unbiased viewpoints about a vendor and its offerings. In fact, existing customers of a vendor you’re considering are an invaluable resource that you can trust due to their independence.

Moreover, by tapping into the experiences, insights and advice of fellow B2B buyers, you can eliminate a lot of the legwork involved in evaluating a vendor and its offering.

Your peers can guide you through the selection process by:

  • advising on questions to ask
  • sharing competitive comparisons
  • explaining the implementation process
  • indicating the price you can expect to pay
  • and more

So, as you are evaluating options for a purchase, be sure to gather information directly from your own trusted sources, whether via a social network such as LinkedIn or Twitter, a vendor community, a product review website like IT Central Station, or in-person at networking events and conferences.

Not sure how to take advantage of the wisdom of your peers? Here are a few recommendations for how you can connect with other buyers and benefit from their insights:

  • Attend a vendor’s user group conference to interact with existing customers.
  • Send an email to trusted peers or post your vendor-specific questions on a relevant LinkedIn Group or online community forum.
  • Search for vendor names on social networks and third-party sites to unearth relevant conversations.
  • Seek out customer “testimonials” and reviews through third-party sites such as IT Central Station to avoid a “pre-selected” customer reference.
  • Take into account both positive and negative feedback about a vendor and/or its offering to develop a well-rounded picture.

Learn more about the role of customer advocates in the buying process:

Join me and IT Central Station Founder and CEO Russell Rothstein for a Google Hangout on March 13 at 1 p.m. EST. Our guest will be Eric Dirst, a former CIO who spent more than a decade leading IT organizations before being promoted to President of Online Services at DeVry Education Group. Register now.

Databases, Decisions, and Disruption

This week’s guest blogger is Eric Evans. Eric has over 20 Years of experience enabling organizations through multi-disciplinary technology strategies and agile information systems management. Go to IT Central Station to read his reviews of Solarwinds NPM, IBM Cognos, Oracle E-Business Suite and other enterprise solutions. Contact us if you would like to be one of our guest bloggers.

By now most of us are well aware of the data explosion – that businesses are creating more data than they can effectively manage. This is not a new problem. Throughout history societies have always made efforts to create repositories to organize, analyze and store documents (recorded knowledge). Some of these ancient repositories still exist today in the form of “brick and mortar” libraries. But just like anything else in a consumer’s market, demand (Time-To-Solution) eventually becomes greater than the supply (Information Available/Accessible).

The global economy is currently undergoing a fundamental transformation. Market dynamics and business rules are changing at an ever increasing speed. Those responsible for keeping the company on track for the future have a massive need for high-quality data–both from inside and outside the company. Technology decision makers are facing the challenge of having to create infrastructures that leverage speed, scale and availability.

Data technology must assist in the removal of silos and support collaboration and the sharing of expertise across the company and with business partners. Successful companies will need access not only to their own “Data repository” but to data from various heterogeneous sources. Today, finding mission-critical data or even being aware of all potential sources is more a question of luck and intuition than anything else.

How important is your data to your organization? How does your organization use its data? How do people access and interact with it? Are the decisions being made from data innovative or disruptive in nature? What’s the value and impact?

According to a Forbes article written by Caroline Howard, “People are sometimes confused about the difference between innovation and disruption. It’s not exactly black and white, but there are real distinctions, and it’s not just splitting hairs. Think of it this way: Disruptors are innovators, but not all innovators are disruptors — in the same way that a square is a rectangle but not all rectangles are squares”.

Database accessibility is critical for rapid but sensible, innovative, and disruptive decision making. A business database management system must be able to process both transactional and analytical workloads fully in-memory. By bringing together OLAP and OLTP in a single database, your organization can benefit dramatically from lower total cost up front, while gaining speed that will accelerate its business processes and custom applications.

SAP HANA DB takes advantage of the low cost of main memory (RAM), data processing abilities of multicore processors and the fast data access of solid-state drives relative to traditional hard drives to deliver better performance of analytical and transactional applications.

Fusing SAP HANA with a scalable shared memory platform will enable businesses and government agencies running high-volume databases and multitenant environments to utilize high-performance DRAM that can offer up to 200 times the performance of flash memory to help deliver faster insight.

Here’s my analogy: people go to the Super Bowl for one of two reasons – to watch or to participate. To be successful in today’s global market, companies must effectively participate or risk being on the sidelines watching.

Do You Throw Bandwidth at the Problem?

This week’s guest blog post is by Bruce Kosbab. Bruce is CTO at Fluke Networks – Visual. If you haven’t already, check out the real user reviews of Visual TruView here on IT Central Station.

If you are a network manager you have likely faced two conflicting business directives when it comes to managing your network: 1) ensuring that you are delivering the optimal end-user experience with your network, and 2) reducing the operational cost of your network.

The need to ensure adequate end-user-experience puts constant pressures on IT to increase bandwidth in order to provide an effective service to the business, while cost management requires that bandwidth is limited, or even reduced. So, how can you manage these conflicting pressures?

Frequently, in situations where there are persistent performance problems with an application, the initial reaction is to throw bandwidth at the problem. However, you can often substantially improve the end-user experience and reduce operational costs merely by using the bandwidth you already have more efficiently.

Informed Decisions

Throwing bandwidth at an application performance problem may be the right answer, but this solution is not immediate. Ordering new circuits can take anywhere from 30 to 90 days. And gathering the data to understand the true bandwidth usage of a link can be time-consuming and error-prone. Before deciding, you need answers to questions like:

  • Which network links need the most attention?
  • Is the bandwidth being used for business purposes?
  • Can I downsize a link while maintaining business service quality?
  • How can I demonstrate that an increase in capacity is warranted?

There are three common approaches used to manage network capacity:

  1. Long-range views of average utilization – shows a long-term trend of utilization, but traffic spikes and even brief periods of congestion are hidden by the highly aggregated averages
  2. Peak utilization – shows the days in a month that had a busy minute but doesn’t give insight into the amount of time for which a link is congested
  3. Traffic usage totals – does not give any indication of congestion except in extreme cases

None of these approaches provides adequate information to make informed decisions.

The Problem With Utilization

Let’s look at an example in which average utilization is used. In this example, I’ve chosen a short time-frame, but it demonstrates the problem with average utilization.

The average utilization over the selected time period, Oct 9 through Oct 14, is approximately 45 percent. By looking at this information one could assume that bandwidth congestion on this interface is not an issue. If we were to use network utilization as a yardstick for bandwidth capacity planning then this interface would most likely not appear on our radar.

[Chart: TruView capacity planning – average utilization, Oct 9–14]

Data aggregation is at the core of the problem in using utilization for capacity planning. The utilization values in the above chart are aggregated into 2-hour intervals. This means that each point in the report represents an average over a 2-hour timeframe. This aggregation has a smoothing effect on the data that masks high-congestion periods.
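The smoothing effect of aggregation is easy to demonstrate with a quick sketch (the sample numbers below are made up for illustration):

```python
# How averaging 1-minute samples into a 2-hour bucket hides a spike.
minutes = 2 * 60  # number of 1-minute samples in one 2-hour bucket

# Hypothetical 1-minute utilization samples: a steady ~40% link
# with a 10-minute burst at 95% in the middle of the window.
samples = [40.0] * minutes
for i in range(60, 70):
    samples[i] = 95.0

peak = max(samples)
average = sum(samples) / len(samples)

print(f"peak 1-minute utilization: {peak:.0f}%")   # the burst the users felt
print(f"2-hour average reported:  {average:.1f}%")  # what the report shows
```

A 10-minute burst at 95 percent barely moves the 2-hour average (it reports roughly 44.6 percent here), so a report built on aggregated averages would never flag this link.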

To demonstrate the smoothing effect, let’s zoom in on a 60-minute timeframe within the period shown in the above chart. On the afternoon of Oct 11th, the utilization peaks at around 80 percent, which is not evident in the 5-day view of the data.

This level of granularity is what is needed to truly understand the network utilization. The problem is that getting this fine level of granularity over a month or a year is not feasible because it requires a vast amount of data to be stored and displayed in order for the real utilization to be visually and quantitatively apparent.

[Chart: TruView capacity planning – 60-minute zoom, Oct 11]

There is a Better Way

There is a technique for analyzing network utilization, used by Fluke Networks’ products, that provides more actionable and accurate decision-making information. We call this data Network Burst data. Burst Utilization indicates the amount of time that interface utilization is greater than specified thresholds.

By using Burst Utilization you can determine how long the congestion of a link exceeded 80 percent utilization or other utilization thresholds. With this type of information you can make decisions on whether to upsize (or downsize) a link or whether to investigate how the link is being used.

The advantage of Burst Utilization is that link congestion levels can be reported at 1-minute granularity regardless of the reported time frame (a day, a month, a year) without loss of information fidelity. Contrast this with average utilization over 15-, 30-, or 60-minute time ranges, which dampens the utilization trend and makes accurate capacity planning decisions very difficult, if not impossible.
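Conceptually, the Burst Utilization measure described above reduces to a simple calculation: the fraction of 1-minute samples above each threshold. Here is a minimal sketch with made-up sample data (not Fluke Networks’ actual implementation):

```python
# Hypothetical 1-minute utilization samples (percent) for one interface
samples = [20, 50, 85, 90, 40, 70, 95, 25, 60, 88]

thresholds = [30, 60, 80]  # matches the chart's yellow/orange/red bands

def burst_utilization(samples, threshold):
    """Fraction of 1-minute samples strictly above the given threshold."""
    return sum(1 for u in samples if u > threshold) / len(samples)

for t in thresholds:
    print(f"time above {t}%: {burst_utilization(samples, t):.0%}")
```

Because each sample only contributes a yes/no to a per-threshold counter, the counts can be rolled up over a day, a month, or a year without ever averaging away a congestion spike.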

Network managers typically want to begin keeping an eye on a particular interface when it spends more than 10 percent of its time above 80 percent utilization. That translates to a little more than half a day out of a typical workweek. When the utilization burst reaches 20 percent of time spent at the 80 percent threshold, i.e. a full workday, then it may be time to either upgrade the link or investigate how it’s being used.

[Chart: TruView capacity planning – per-interface burst data]

The chart shown above lists the interfaces being monitored and their respective burst data. The color breakdown indicates, for each interface, the time spent over 30 percent utilization (yellow), 60 percent (orange), and 80 percent (red). The interface listed first is obviously in trouble: it is running above 80 percent utilization all of the time.

Gathering Data to Make a Decision

If we look at Burst Utilization for a given interface, we can determine at a glance which days of the week and hours of the day are most congested.

[Chart: TruView capacity planning – congestion by day and hour]

And then, we can investigate whether the bandwidth is being used for business purposes or for recreational use.

In the chart shown below, it appears that most of the bandwidth is consumed by legitimate business applications. The 1755/TCP application bears further investigation though.

[Chart: TruView capacity planning – bandwidth usage by application]

When all of this information shown above is accessible in a single solution and in one place in that solution, making bandwidth-sizing decisions can be quick and easy.

Call To Action

The goal of managing network bandwidth is not to report on the utilization of a link over time, but rather to ensure that you are buying the right amount of bandwidth to meet the needs of the business.

Please let me know how you perform bandwidth management:

  • Is WAN capacity management part of your standard process?
  • What tools do you use?
  • What are your biggest challenges in managing bandwidth?

Also, please take a look at Visual TruView from Fluke Networks. That solution can help you with your network capacity management chores, plus it can help you understand whether network congestion issues are indeed causing application performance issues.

Don’t throw bandwidth at the problem. Make informed decisions with the right data.