Founded in 2005, DCIG (Data Center Infrastructure Group) is a group of analysts with IT industry expertise who provide informed, insightful, third-party analysis and commentary on IT hardware, software and services.
DCIG independently develops and licenses access to DCIG Buyer's
Guides. DCIG Buyer's Guides provide actionable intelligence through
comprehensive, in-depth analysis of data center infrastructure product features.
DCIG also develops sponsored content in the form of blog entries, customer
validations, product reviews, special reports and executive, standard and
full-length white papers. DCIG's target audiences include C-level
executives, IT managers, systems and storage engineers and architects,
press/media, magazine and website editors, bloggers, financial and technical
analysts, and cloud service providers.
see also:- storage market research companies list, miscellaneous consequences of the 2017 memory shortages
editor's comments:- DCIG entered the SSD market research business in July 2010 with a Fusion-io focused paper about SSD Architecture and PCIe SSDs.
But thereafter DCIG wisely avoided focusing too much on drive-level SSD market reports - an area where it had little native expertise and a topic which came to be oversupplied by many storage market research companies.
In recent years (2014 onwards) DCIG has instead focused on leveraging its enterprise-based experience in a series of buyer's guides covering different types of SSD box products (hybrids, rackmount SSDs etc).
These guides have become well known in the industry - assisted in no small part by publicity from vendors whose products have received good reviews or comments in their pages.
I had misgivings about the usefulness of the scoring systems used in early versions of these systems guides. The main reason for my concern was that the attributes used within such scoring systems were not generally applicable across the enterprise SSD user base.
This
has been an emerging problem for the product marketers in leading SSD systems
companies too - and not just for analysts - for reasons analyzed in my
article - Decloaking
hidden and missing segments in the analysis of market opportunities for
enterprise rackmount flash.
So in my view the early vintage reports from DCIG were a mixture of valuable data and inferences which could be unreliable in many contexts. I fed my concerns back to DCIG - who said they welcomed that kind of feedback.
DCIG ranks top rackmount SSD vendors
Editor:- March 31, 2014 - If you're interested in rackmount SSDs then DCIG has published the DCIG 2014-15 Flash Memory Storage Array Buyer's Guide (free sign-up page) - which provides detailed comments on the strengths and weaknesses of rackmount SSD systems currently available in the market from 20 different vendors (and includes list prices).
DCIG has created its own multi-dimensional scoring system which looks at component features such as density (TB/U), software compatibility (for example ease of integration with VMware), and management functions (dedupe, tiering, snapshots etc). DCIG has ranked these systems overall - and compared many of them to others in the same price band. Another useful feature of the report is a background story about the design heritage or market history of each product.
Editor's
comments:- I've read the report and think it's a good read with respect to
the raw data and detailed observations about many of the systems listed.
As
to the product rankings?
I think whether you agree or not depends on whether you would assign the same weights to each constituent in the confidential matrix of factors which DCIG has devised. For some users the scoring will reflect their own priorities - for others the outcome would be entirely different.
Among the SSD vendors listed in the
report - the happiest will be
Nimbus (who have been
crowing today
about being #1) - and happy too should be
HP (which is #2).
Some vendors - whose products are best in class in a particular dimension - don't score highly in the main list because they lose out on the "sum of all things which DCIG think you might need" - which is an application-dependent judgement rather than a universal "goodness" attribute.
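To make that weighting point concrete, here's a minimal sketch (in Python, with invented product names, feature scores and weights - DCIG's actual factor matrix is confidential and certainly richer than this) of how the same per-product scores can produce entirely different rankings under different weightings:

# Illustrative only: hypothetical feature scores for three imaginary arrays.
# None of these numbers come from DCIG's guide.
arrays = {
    "Array A": {"density": 9, "vmware": 4, "data_services": 5},
    "Array B": {"density": 5, "vmware": 8, "data_services": 8},
    "Array C": {"density": 7, "vmware": 7, "data_services": 6},
}

def rank(weights):
    # Weighted sum of feature scores for each array, highest first.
    totals = {
        name: round(sum(weights[f] * score for f, score in features.items()), 2)
        for name, features in arrays.items()
    }
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

# A buyer who cares mostly about rack density (TB/U)...
print(rank({"density": 0.6, "vmware": 0.2, "data_services": 0.2}))
# [('Array A', 7.2), ('Array C', 6.8), ('Array B', 6.2)]

# ...versus one who cares mostly about VMware integration and data services.
print(rank({"density": 0.1, "vmware": 0.45, "data_services": 0.45}))
# [('Array B', 7.7), ('Array C', 6.55), ('Array A', 4.95)]

Note that the array which is best in class on density tops the first ranking but comes last in the second - which is exactly why a single overall "winner" list encodes one particular set of buyer priorities rather than a universal verdict.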
The only company which is conspicuously absent from DCIG's list (at any rank) is Fusion-io. Does DCIG know something we don't? That's very odd.
DCIG publishes new edition of its AFA Buyer's Guide
Editor:- September 30, 2015 - DCIG recently announced a new edition of its All-Flash Array Buyer's Guide (60 pages, free signup) which - from a desk-based research stance - describes, comments on, and compares in depth the features of key products in this category from 18 selected vendors in the market (AMI, Dell, EMC, Fujitsu, HDS, HP, Huawei, IBM, iXsystems, Kaminario, NetApp, Nimbus Data, Oracle, Pure Storage, SolidFire, Tegile, Violin Memory and X-IO).
Editor's comments:- One of the roles DCIG suggests for this document is as a "short list" - a quick and convenient way to get your hands on consistently presented, in-depth datasheets giving a market snapshot of products from a range of credible sources.
As to how the sample list of vendors is cast - DCIG clearly states it does not merely rely on vendors paying for inclusion in the list. Nevertheless one of the problems with the authority of any "buyer's guide" is its degree of inclusivity and (by implication) the transparency of its filtering criteria.
When you include hundreds of products from all known vendors in such a guide, the sampling process is transparent (and those not in the guide need to make better efforts to communicate with their market). But when a guide samples only a small percentage of vendors, questions inevitably get asked about how those in the sample were chosen.
My guess on the representational value of the companies listed in the guide is that it's compatible with the kind of shortlist you'd get by sampling from three broad criteria:
- companies added to the list based on public revenue criteria and corporate brand strength (to ensure inclusion of older, long-established storage companies)
- companies added to the list based on search strength or social media derived ranking rather than revenue (to ensure sampling of some newer companies)
- companies added to the list for arbitrary reasons (maybe they've got a particularly interesting feature which the authors want to discuss as a counterpoint to others, or maybe the authors have some special relationship with the company which means they know more about it)
It took me about 30 seconds after seeing DCIG's vendor list to conclude that the above (or some reverse analysis thought process like it) is probably as good an explanation as any of how DCIG constructed its list.
I'm not saying that's how they did it. But if you had to construct a vendor list of reduced size (and DCIG does have to, because - given their format - it would be cumbersome, repetitious and wasteful of analyst time to scale the guide to hundreds of vendors) this is as good a way as any other for the purpose of discussing representational features in the AFA market.
So in that respect
(unlike others) I don't have any quarrel with the sample they've chosen.
It
sure wouldn't be my list. But DCIG's authors are aiming to produce a different
kind of guide and they see their added value as coming from their proprietary
vendor scoring criteria. And that necessitates a different kind of list.
In
a free competitive market - reports compete for your attention - just as much
as products. And you don't have to like every feature to learn something
useful from them.
DCIG's scoring criteria are where I part company with DCIG's thinking. And this is a gulf I can't bridge.
I just have
to look away from these pages to prevent my crystal ball cracking for reasons
I explained when discussing an earlier version of this guide back in
March 2014.
I think the scoring concept intrinsically suggests a much more stable, restricted and naive model of the enterprise SSD market than is currently the case. In some respects the scoring concepts are a bridge too far - sometimes leading to the wrong places, and sometimes entirely missing critical destinations.
Nevertheless I'm sure DCIG's new guide will serve adequately for many people who see things the same way as the guide's creators do and who like their way of doing things. So I expect there will be more editions of this guide in future.
It's not DCIG's fault that the enterprise SSD market at times resembles the navigational uncertainty of Lost in Space (the TV series) - in which the rocket gets hit by a meteor storm in the very first episode.
In the SSD market we've been through a whole bunch of similar cosmic disturbances - and our rocket was launched with no clear destinations in mind at the outset. The best we can hope for is plausible pragmatic reinterpretations at convenient refueling stops.
BTW - I'm
not suggesting that anyone else could do a better scoring job by using different
methodologies.
Instead what I'm saying is that such a style of
analysis is inappropriate because of current
defects in
enterprise SSD market models and the general understanding of them.
While that situation persists - such simplistic "winner" style guides run the risk of advocating the essential flavor of beef to vegetarians.
"When I talk to SSD
companies - an interesting part of the conversation is often trying to figure
out how products - which barely exist yet - will compete and fit into a future
infrastructure which doesn't yet exist either..." |
Boundaries
Analysis in SSD Market Forecasting | | | |