I've been working exclusively on implementing BI solutions for the past five years. The thing that depresses me the most is not that BI solutions take forever to implement and cost a lot, but that clients often just don't understand the data they are trying to report on. Many times this leads to an over-engineered BI solution built for one report that a client says is mission critical, but that is never used.
I'm sure more technology focused companies don't have any issues using these self-service models, but you wouldn't believe the innumeracy that some people have in industry.
> one report that a client says is mission critical, but is never used.
My dad started developing software in the late 60s. As a kid (let's say circa 1982, definitely in the minicomputer era), I remember him talking about a problem at work: to do all their daily processing, they needed about 28 hours. A lot of the workload was reporting, so he asked managers what reports were no longer useful. Naturally, he was assured that every report was absolutely vital to proper functioning.
His solution was just to start dropping reports. If anybody complained, he'd put them back in the job list. A significant number of reports went unlamented, and soon the computer was able to complete its daily workload handily.
The lesson I took from this is that expressed desire is often very different than actual need, so separating the two can pay big dividends. I've never used that trick, but the lesson runs all through my methods.
Well, the only issue I can see with that nowadays is if one such report is actually needed only for compliance purposes. Then, five years down the line, an audit happens and 1825 copies of that required daily report happen to be missing...
Then someone should be verifying that they're being created. Same idea as with backups, where you're supposed to check and verify them even if you're not "using" them.
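A minimal sketch of that kind of check, assuming the daily report lands as a dated file (the reports/ path and naming scheme here are made up for illustration):

    # Verify that a compliance report exists for every day in a range;
    # run from cron and alert if the returned list is non-empty.
    import datetime
    import pathlib

    def missing_daily_reports(start: datetime.date, end: datetime.date, directory="reports"):
        """Return the dates in [start, end] with no report file on disk."""
        missing = []
        day = start
        while day <= end:
            path = pathlib.Path(directory) / f"daily_{day.isoformat()}.pdf"
            if not path.exists():
                missing.append(day)
            day += datetime.timedelta(days=1)
        return missing

    print(missing_daily_reports(datetime.date(2015, 1, 1), datetime.date(2015, 12, 31)))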
There is an interesting phenomenon that I've noticed about BI: By the time data is appropriately gathered, cleaned, aggregated, and presented, the data is so closely aligned to the decision process that the decisions themselves might as well be automated and optimized.
But that never happens. We build reports so that executives can look at them and feel important while they make slower and less optimal decisions than computers could make.
Whilst I agree with the feeling, I think a lot of BI devs have a tendency to throw the baby out with the bathwater.
Decision making is a complex process. The graphs and data fed to the executive via BI are just inputs to the deep net of his brain, which has been trained on the "data" absorbed over decades of experience. The objectives themselves are not simple to model - management is a delicate balancing act between competing stakeholders. The job of the BI professional is not to just produce what he is told, but to figure out what problem the person making the request is trying to solve, and then solve it in the simplest way possible. Occasionally this requires teaching them some things.
Taking an example: imagine you have an engine vibrating normally. You want to set up an alarm that rings if the engine vibrates abnormally - specifically, if a new frequency is added to the existing signal (maybe it indicates a screw is coming loose or something). You can feed the signal as-is to your algorithm, or you can put it through an FFT, in which case the "signal" is just a bunch of peaks at each frequency, and your algorithm is literally just a switch (if the peak at frequency f reaches amplitude A, trigger the alarm). The switch is orders of magnitude simpler, cognitively, than the algorithm that is fed the raw signal; it's also likely to be more accurate. Feature engineering is arguably the most important part of statistical learning.
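To make that concrete, here's a minimal sketch of the switch in Python; the sample rate, the 120 Hz "fault" frequency, and the threshold are all invented for illustration:

    # After the FFT pre-processing, the "model" is a single threshold check.
    import numpy as np

    def vibration_alarm(signal, sample_rate, fault_freq=120.0, amplitude_threshold=0.5):
        """Return True if the spectrum shows a peak at fault_freq above the threshold."""
        spectrum = np.abs(np.fft.rfft(signal)) / len(signal)        # one-sided amplitude spectrum
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)   # frequency bins in Hz
        idx = np.argmin(np.abs(freqs - fault_freq))                 # bin closest to the fault frequency
        return spectrum[idx] >= amplitude_threshold                 # the whole "algorithm" is one switch

    # Example: a healthy 50 Hz vibration, plus a faulty version with a 120 Hz component.
    rate = 2000
    t = np.linspace(0, 1, rate, endpoint=False)
    healthy = np.sin(2 * np.pi * 50 * t)
    faulty = healthy + 1.2 * np.sin(2 * np.pi * 120 * t)
    print(vibration_alarm(healthy, rate), vibration_alarm(faulty, rate))  # False True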
The executive is like the alarm - pre-processing the signal is your job. They are used to simple tools, usually univariate and linear, at a stretch, some can deal with simple polynomials. The better you pre-process the signal, the easier it becomes for the executive to make a correct decision by associating the new data to whatever decades of experience he trained his brain on.
A concrete example: let's say your CMO has asked you to give him voucher and new-customer numbers for the last 6 months. He's clearly trying to establish the relationship between his voucher campaigns and new customers. You can give him the vouchers and the new customers, daily/weekly/whatever, put it on a nice Tableau graph, hand it to him, and forget about it... or you can confirm that the problem he is looking to solve is indeed the relationship between his campaigns and new customers.
At which point you ask why new customers? And you find that he has a theory that gaining new customers is the best way to increase revenue, and the objective of the company is to increase revenue (due to incoming fundraising round whose valuation is based on revenue and revenue growth), but it is short term cash flow constrained (hence looking at vouchers instead of, say, marketing spend).
Since he probably doesn't know that a model can have more than one variable, you explain that to him and brainstorm what other variables might impact revenue growth. Assuming you're trying to predict new customers per income statement dollar and new customers per cash flow dollar, you might find that online marketing spend and season are two significant variables, and that there is a significant interaction term between vouchers and certain types of marketing spend. It's now your job to explain that "formula" - standard errors of the coefficients included - to the executive, and brainstorm what output he needs to quickly check how this input has changed over time (which might just be... an alarm). You'll also have to explain the measure of fit you are using (R-squared almost always wins by virtue of being very intuitive).
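A hedged sketch of what that multi-variable model might look like, using statsmodels' formula API; the column names (new_customers, vouchers, marketing_spend, season) are hypothetical:

    import pandas as pd
    import statsmodels.formula.api as smf

    def fit_new_customer_model(df: pd.DataFrame):
        # vouchers * marketing_spend expands to both main effects plus their
        # interaction term; C(season) treats season as a categorical variable.
        model = smf.ols("new_customers ~ vouchers * marketing_spend + C(season)",
                        data=df).fit()
        print(model.summary())   # coefficients, standard errors, R-squared
        return model

The summary output is already close to what you'd walk the executive through: which inputs matter, how uncertain each coefficient is, and how well the model fits.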
And of course, this is what I identify as the gap that none of these BI products can ever hope to fill. You need someone technical with a full grasp of all the data sources of the company - AND the implicit and explicit data model of the business - who also happens to be continuously involved in management discussions and at least moderately aware of the business. Most companies have a setup whereby the BI team is some sort of self-service restaurant where the executive swoops in, gets his request processed, and swoops back out. Many prefer hiring young, inexperienced BI staff because BI is seen as a cost centre, and because the way they "scale" requests is by adding headcount. One offshoot of this is that the executive starts wanting a Tableau, something with all the data neatly prepared that can be dragged and dropped into the 1-dimensional models he uses to pre-process his company data.
The upshot is that it's not that executives are stupid, but more along the lines of GIGO. Without the tools required to make sense of the signals they receive, executives cannot make the right decisions even if they have the right experience and thinking. I suspect a large part of why more experienced executives are smarter is that they have learned, over decades of experience, to spot trends based on very simple signals; for example, an experienced hedge fund manager will sniff out a fraudulent company much faster than someone who has just started, just by looking at the financial reports.
Forget the data. Many companies don't even understand their own processes. I've gone to so many HR departments, asked them how their hiring process works, and been met with blank stares, because there wasn't a single person who knew the entire process from start to finish. Each individual knew bits and pieces and made assumptions about how the rest of the process worked. Even though my main role is software implementation (enterprise), I spend a ton of time interviewing people and putting those pieces together. It's probably my least favorite part of the job.
Yep. Any exec or manager looking to buy this service as a solution to their reporting needs should sit down and have a serious talk with their analysts or teams who build their reports first.
Innumeracy is why the mantra of asking "Why?" instead of "How?" also applies to BI reporting. A flashy new tool or report isn't going to help as much as having a builder with industry knowledge or experience.
Technology focused companies can have the same issues. In order to build a clear understanding, it requires not just having the data available (the RIGHT data obviously, and clear and extensive documentation on specific definitions and business meanings), but it also requires education within the organization of how one should look at, evaluate, and understand the different pieces of a business.
The good thing though is that there's a tipping point, where once ~1/3 or so of people fully grok how to evaluate and understand the metrics they're looking at, those who do understand start helping (or calling out) those that don't, lifting competence throughout the organization.
I know marketing directors of large retailers who can't figure out what 10% off a figure is, and who get confused as to which is gross and which is net. I end up explaining VAT at least once a week.
It's cool to be innumerate, though, and if you can't do this stuff yourself there's always a nerd to blame somewhere nearby.
Having worked in some form of BI for almost the entirety of my career, I can say there is no single, consistent form of BI dashboard that is prevalent across companies. Every solution ends up being unique, because every company has a unique data set-up, stakeholders, definition of metrics, and access needs.
I've worked with Tableau, Domo, Oracle products, you name it. What's the solution that is passed around the most? Excel sheets, because they travel easily and have all-around permissions.
I've been waiting for an out-of-the-box solution that's at least relatively easy to leverage across different organizations, but I haven't seen a painless one yet.
I'm hopeful that Quicksight, while not the be-all end-all solution, provides an example for others to follow, if it does end up being easy to set up and use.
I'm not really a BI guy by trade, though I've developed related skills over time. My expertise is in communications. Specifically responding to RFPs/RFIs/SOQs, and marketing collateral.
Whenever I see a program such as this, I'm definitely impressed.
Then, a little voice comes into my head, one from having spent years in the RFP and presentation trenches...
"Put this in a PowerPoint slide."
Point being, for internal use it's nice, but it may not be that different from what other software already does with proper data input (e.g. Excel).
I lead a team that develops a more advanced BI toolkit that, apart from supporting all typical BI scenarios, handles PPTX export very well (generating 50+ quarterly presentations for investors, etc.). If you're interested, drop us a line at www.binoclebi.com.
I use it regularly. It's not so much about sharing, which is easy and works well, but about getting data into Domo. The pain points are formatting data and scheduling exports, updating them, setting up new exports, etc.
Got it. Is it a data-source you're pulling in from a connector, or from an on-prem system? We've recently added almost 100 self-serve connectors. https://www.domo.com/connectors
I've been talking to the product team a lot about related feedback lately. If you want to share your thoughts, I'd love to take them down, and make sure they get to the right folks.
Also, have you seen the new export API? It could solve part of the problem you mention. adam.chavez at domo
Just wanted to reemphasize sibling comment's petition for feedback on Domo. I work on the data pipeline team at Domo, and we love hearing ways to make the process simpler for our users. If you haven't already, be sure to ping adammichaelc with your use case and problems.
At Telemetry (https://www.telemetryapp.com) we're trying to solve this issue which is quite a bit more difficult than it initially seems. We're not a BI tool though, just a transmission, management and visualization layer.
We keep dashboards persistently up and running on televisions, provision mobile devices with granular permissions, and offer enterprise features like SSO. We provide an agent to integrate on the client side to help send metrics from whatever data source you may have. Take a look, I'd love to know your thoughts.
Excel also has a pretty easy-to-understand XML import.
At my new gig I've just avoided having to add 'just one more report' to the fragile homebrew report designer by letting people pull out structured data and make their own reports. The code required was trivial and the risk was super low, as we could just use existing API methods to get the XML out.
This has caught my eye though, so I'll have a tinker in the morning.
I'd be interested to learn more about this. I've never heard of anyone importing XML into Excel before - typically "flat files" or SQL datasets are imported into an Excel table, upon which you can quickly build a pivot table, etc. Are you pulling in nested data?
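For what it's worth, the way I'd expect nested data to be handled is to flatten it to one row per leaf record before it ever reaches a pivot table. A rough Python sketch - the <orders>/<order>/<item> structure is invented, not what the parent necessarily uses:

    import xml.etree.ElementTree as ET
    import csv

    xml_data = """
    <orders>
      <order id="1001" customer="Acme">
        <item sku="A1" qty="2" price="9.99"/>
        <item sku="B7" qty="1" price="4.50"/>
      </order>
    </orders>
    """

    rows = []
    for order in ET.fromstring(xml_data).findall("order"):
        for item in order.findall("item"):
            # one flat row per (order, item) pair, ready for an Excel pivot table
            rows.append({"order_id": order.get("id"),
                         "customer": order.get("customer"),
                         "sku": item.get("sku"),
                         "qty": int(item.get("qty")),
                         "price": float(item.get("price"))})

    with open("orders_flat.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)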
I'm the founder of Chartio.com here. Not sure if you've given us a look yet (or lately) but we pride ourselves on being as usable and flexible as possible. Would love to show you more: dave@chartio.com
We use Chartio at my company. If you haven't tried it yet, this product is stellar - almost all of our BI needs are handled directly by stakeholders rather than having to go through an engineer, and their support is very helpful and responsive.
I would consider moving to something like Quicksight if it supported Redis. Some of our BI-related data is stored there, and currently, to get at it, we have an app that proxies data from Redis to Postgres for Chartio's sake.
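The proxy is roughly this kind of thing - a hedged sketch, not our actual app; the key pattern, table name, and connection settings are invented:

    # Copy BI counters stored in redis hashes (e.g. keys like "metrics:2015-10-08")
    # into a postgres table that Chartio can query.
    import redis
    import psycopg2

    r = redis.Redis(host="localhost", port=6379, decode_responses=True)
    pg = psycopg2.connect(dbname="bi", user="bi", host="localhost")

    with pg, pg.cursor() as cur:
        cur.execute("""CREATE TABLE IF NOT EXISTS redis_metrics
                       (key TEXT, field TEXT, value TEXT)""")
        for key in r.scan_iter(match="metrics:*"):
            for field, value in r.hgetall(key).items():
                # simple append of the latest snapshot; dedup/upsert left out
                cur.execute("INSERT INTO redis_metrics (key, field, value) VALUES (%s, %s, %s)",
                            (key, field, value))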
It is terrific that your end users are willing to spend the time and effort on building their own BI solutions. In my experience, at least with C-levels, that is not the case at all. C-levels also tend to want the most stuff.
I agree, but I must point out that in some cases the alternative to the "time and effort" involved in the exec playing around with a particular end-user report tool is the time and effort involved in a series of meetings attended by the exec and everyone on the chain between the exec and whoever is actually building the reports.
If "not able to specify a report format to a software service, even with assistance-as-needed from more technical users" disqualified one from executive employment, would any firm actually be hurt by that?
There are so many BI tools around that it's hard to figure out which solution is going to be best for a particular use case. Creating further confusion, it seems like most "enterprise" BI products aren't explained properly on their websites and are hidden behind "request a demo". It's nearly impossible to evaluate all the possibilities without going crazy.
I want to be able to generate my domain models in some way. Point and click data descriptions are awful. Letting certain people work with raw data is fine, but a lot of users are going to want to work with names that make sense to them. Let me define models with text, just like ORM models.
I want row based security. Let me assign groups to values on certain models. This essentially boils down to hidden filters and required tables.
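A minimal sketch of what I mean by hidden filters - the table and column names are hypothetical:

    # Before a user-defined query runs, the server appends a predicate derived
    # from the user's groups. The user never sees or edits this filter.
    ROW_FILTERS = {
        "sales": "region",   # model -> column that must match the user's allowed values
    }

    USER_GROUPS = {
        "alice": {"region": ["EMEA", "APAC"]},
        "bob":   {"region": ["NA"]},
    }

    def apply_row_security(model: str, base_sql: str, user: str) -> tuple[str, list]:
        column = ROW_FILTERS.get(model)
        if column is None:
            return base_sql, []
        allowed = USER_GROUPS[user][column]
        placeholders = ", ".join(["%s"] * len(allowed))
        # wrap the original query and restrict it to the user's rows
        return f"SELECT * FROM ({base_sql}) q WHERE q.{column} IN ({placeholders})", allowed

    sql, params = apply_row_security("sales", "SELECT region, revenue FROM sales", "bob")
    print(sql, params)   # bob only ever sees NA rows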
It should all be web based. I'm not exposing my database directly to customers.
It should definitely not cost 100k a year.
I like the idea of QuickSight, but I can already see that it's not going to work for my needs. But at least they give an upfront description and price. Here's hoping the pricing model drives down the crazy license fees the other vendors are extracting.
I think the big issue with "bi" is that the term is just too broadly applied to be meaningful anymore. Every piece of software with a chart in it somewhere is billed as BI these days and moreover if you ask 10 business users what they want out of their BI system you'll get descriptions of several very different tools which all happen to produce charts. I'm sorry to say I've seen countless businesses deeply disappointed in their BI tools simply because they assumed since "it's a BI tool" it will do X,Y,Z when in reality there are a bunch of different classes of tools to choose from with very different features. The few options that really cover the spectrum are extremely complex and costly to really get value from over the long term. There's a reason why the analyst industry exists - it's just too much damn work to suss out what all these tools actually do on your own.
I make BI tools for a living and I am still struggling to really do a great job classifying all these types of functionality and communicating it well enough to lead customers to the tools they really need.
Hi @jsmeaton, this is Vincent from Holistics (www.holistics.io).
While our website also has a "request a demo" button on the landing page, it's because we have just launched and are looking to validate some of the use cases we have built. We would really appreciate it if you could contact us and share your thoughts.
Would you mind dropping us a note on our website so we can contact you? It would be interesting to get your feedback, and I suspect what we have built (or are building) may meet some of the things you've listed above, though I'd still want your validation.
A brief introduction about us. We started off as an internal data dashboard for an online video streaming company, serving a specific reporting niche (not a full fledged BI tool) with use-cases different from BI vendors such as Quicksight or the other vendors.
We're trying out Apache Spark with Apache Zeppelin and it's been a pleasure so far. We faced the same problems that everyone else mentioned here -- data is not accessible to people who need it and every datasource requires different tools.
What we like about Apache Spark is that it can take any source and provide the same very fast and programmatic (code reuse!) interface for analysis. Think JSON data dumps from MixPanel, SQL databases, some Excel spreadsheet someone threw together etc.
Apache Zeppelin is a little bit limited in the visualizations that come out of the box, but the benefit of having a shared data language across the company is just such a huge plus. It's also super easy to add data visualization options, and hopefully companies will start to contribute these back to the project.
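A rough sketch of what "same interface for every source" looks like in practice - the file paths and column names (user_id, event, plan) are made up:

    # A MixPanel-style JSON dump and a spreadsheet exported to CSV both become
    # temp views you can join with plain SQL.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("shared-data-language").getOrCreate()

    events = spark.read.json("exports/mixpanel_events.json")         # JSON data dump
    signups = spark.read.csv("exports/signups.csv", header=True,     # someone's spreadsheet
                             inferSchema=True)

    events.createOrReplaceTempView("events")
    signups.createOrReplaceTempView("signups")

    # From here on, everyone speaks the same SQL regardless of where the data came from.
    spark.sql("""
        SELECT s.plan, COUNT(*) AS purchases
        FROM events e
        JOIN signups s ON e.user_id = s.user_id
        WHERE e.event = 'purchase'
        GROUP BY s.plan
    """).show()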
For me, the most interesting part is SPICE. To implement a data warehouse, one creates a star schema (OLAP) database from a regular OLTP database, which involves a massive amount of work. It looks like SPICE aims to replace the need for an OLAP database and produce similar data directly from OLTP systems. I would love to know more about this engine. I hope Amazon open sources it (I highly doubt that they will).
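For anyone unfamiliar with the star-schema step, here is a toy pandas illustration of the kind of transformation it involves: normalized OLTP rows are joined to dimension tables and rolled up into a fact table that reporting tools can slice cheaply. All table and column names are invented.

    import pandas as pd

    orders = pd.DataFrame({            # OLTP-style transactions
        "order_id": [1, 2, 3],
        "customer_id": [10, 10, 20],
        "product_id": [100, 200, 100],
        "amount": [25.0, 40.0, 25.0],
    })
    customers = pd.DataFrame({"customer_id": [10, 20], "segment": ["SMB", "Enterprise"]})
    products = pd.DataFrame({"product_id": [100, 200], "category": ["Widgets", "Gadgets"]})

    # Join facts to dimensions and pre-aggregate: the "cube" a BI tool would slice.
    fact = (orders.merge(customers, on="customer_id")
                  .merge(products, on="product_id")
                  .groupby(["segment", "category"], as_index=False)["amount"].sum())
    print(fact)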
There are a few similar technologies out there, some of which are open source. Prestodb comes to mind, but there are also some closed systems like Power BI (with some underlying Azure distributed data warehouse bits), etc.
What I think is happening in the industry is that there was a first wave of data discovery products (Tableau being the prime example) that got you straight to the reporting and data exploration side without necessarily having to slog through all the data warehouse design, and now a second wave - pure SaaS plays like Power BI that do something very similar - is starting to come out.
Amazon takes risks, and ships innovative products and services every month. Their risk taking is relentless and they take failure in their stride. They "get" how to push things out.
No other tech company comes close to their pace. And the key to that is the juicy, under-the-radar micromanager that is Jeff Bezos. If there were an award for best tech CEO of 2015, I'd nominate him in a flash.
I'd say Andrew Jassy is the person you are looking for. Jeff Bezos doesn't give a shit about the day to day strategy of AWS or any other existing Amazon business. He has implicitly said as much in his speeches at Amazon. He really only cares about the next billion dollar business to build within Amazon. Jassy leads AWS, which is the only technologically innovative part of Amazon. The retail side of Amazon is a technological black hole...a few standouts swimming amongst a sea of half-implemented non-solutions to problems that no longer exist but can't be changed because legacy.
That definitely matches my understanding. As a long-time user of both Amazon and AWS (respectively 1997 and 2006), their product management strategies seem very different to me.
A lot of Amazon stuff smacks of high-level micromanagement. E.g., their thoroughly failed phone. Or the disappointing feature mish-mash that is the Kindle app. On the positive side, the original Kindle was groundbreaking because of similar micromanagement.
But the AWS stuff feels much more bottom-up to me. They start with some small, discrete notion. They trial it in private, getting feedback and evolving in careful response to users. When it's solid enough, they open it up for everyone. And then they keep iterating, making things gradually better.
I would agree with that. Small story: working on the retail side of Amazon, one day I got pissed that if I wanted to use PostgreSQL I had to set up my own EC2 server and EBS infrastructure and maintain it, whereas if I wanted a MySQL instance I could just use RDS. I had a few legitimate use cases at the time that required PostGIS, so I was annoyed about it and I vented on a few mailing lists about the lack of PostgreSQL in RDS.
In a move that I would have never seen in the retail side, a product manager at RDS emailed me and set up a meeting with me and a director and VP. He invited me to make my case for having PostgreSQL as an RDS option. For about an hour I explained why the existing options didn't fit my use case, how there was a burgeoning market that was waiting for it due to Oracle's mismanagement of MySQL, and how there were several teams within Amazon that preferred the strictness and standards compliance of PostgreSQL but chose MySQL due to not having to manage it. They thanked me, and less than a year later there was a public announcement of a PostgreSQL offering in RDS.
I don't think I can take full credit for them launching it...they already had public forum threads of people asking for it and tons of +1 responses. But they actually listened to me, and they took into account my expressed desire to have several extensions available as well. That sort of bottom-up communication doesn't happen on the other side of Amazon.
One of the most destructive aspects of the mythology of modern tech culture is this ridiculous worship of CEOs, as if they are supermen and the thousands of creative people who actually build the products we enjoy are just the gloves these heroes wear. Stop doing this. You're devaluing the worth of everyone here.
One manager's decision can make the work of hundreds of people worthless or even destructive. I've lived through it and seen it. Like it or not, the shot callers at the top wield enormous influence, and the ones who make consistently good decisions should be celebrated for that.
You are right - one bad decision can destroy a project/product.
But the opposite is not necessarily true.
To ship insanely great things, you need a lot of factors to come together, not just one person's decision (although it helps).
Steve Jobs kept using ideas from the people who worked for him (presenting them later as his own). Yes, he had great intuition and good taste in choosing the better ideas, but without the people who generated those ideas, he would have been yet another arrogant, loudmouth suit.
The culture may launch a lot in some parts, but it does not uniformly ensure quality and is well documented as being difficult to work in.
Amazon is far too large to give Jeff credit for 100% of the output, despite his name being on the door or ultimate decision making authority belonging to him.
Amazon has been busy, but I still have to go with Satya Nadella. Since he got the job in early 2014, Microsoft has shipped amazing things. Many of his decisions have helped redefine the company culture for the better.
Amazon has shipped a lot, but I think the new Microsoft has a bigger impact. Off the top of my head: open-sourcing .NET, SSH to Windows, Surface product line, Hololens development, Windows 10 (hit a few bumps, but free is HUGE), cross-platform software push, stronger open-source commitment...
Satya has made some huge changes, and they're moving, but it's a beast of a company to push.
Win 10 still has a chance to hit a home run and make his name, but only if hard decisions are made. That 6.63% market share could easily be 25%+ right now; something's gone wrong, and they need to look at that and fix it before users find the alternatives.
A lot was started before he took over, but there's still been a dramatic change since it happened.
When SSH was announced, the team said they had tried twice before and were shot down. This time, they got executive support. I bet there were many great, open ideas that Ballmer shot down that Nadella would approve of. My point being: of course good things were in progress before he took over, but they seem more likely to make it out the door now.
If I had a choice between working for Bezos or working for Joel Spolsky at half the pay, it wouldn't even be close. Of course, in real life, Fog Creek probably pays more anyways.
Fog Creek is nice, but you're working on small, localized projects compared to Amazon's scale and breadth.
Amazon employees are pushed hard, and that's their key to shipping innovation.
Look at Google's cuddle farm - lots of innovation that rarely ships and no risk taking.
How about Apple - constrained innovation, low risk, once-a-year shipping.
And there's a hundred other CEOs and companies that just don't come close. From an investor standpoint, Bezos is the gold standard of post-IPO CEO. Risk taking, innovation, shipping. The dude's on point.
He understands risk taking is the key because returns on hits are 10-1000x your investment. Look at EC2. That can cover the cost of 1000 "firephone" style project failures. But you gotta get it out of R&D and into the market. You need to ship.
Under those metrics, almost all of Amazon's products have been flops, and none are in any way similar to the cash cows that Google and Apple have managed to develop and maintain without succumbing to competitors.
I don't think that having to run fast just to stay in place the way Amazon does is a sustainable business strategy.
As an investor I'm not looking for Thiel's "sustainable business strategy", I want growth. New products, new markets, pushing the envelope.
And that comes from taking risks, shipping, and dealing with failures. Google and Apple are simply not doing that in a meaningful way, and their share prices reflect that.
I'm not sure I agree with this about Amazon, but some of it may just be strategy opinion differences. Apple and Google both have higher capitalizations, but more importantly they operate at growing profits as opposed to a loss. Google's share price is also substantially higher than Amazon's.
Bezos has definitely made some good decisions, but I think attributing too much to one person (correct or incorrect) is a bit of a problem. Either the company fails without that person, or the perception is that the company can't succeed without that person.
Regardless, my personal opinion is that Amazon's overall retail strategy and execution are honestly pretty bad.
He strikes me as someone that had a couple brilliant fundamental insights, which compose the core of the business. But if one starts thinking everything one shits out is gold on the basis of such insights, one ends up with the Fire phone. However, that sort of thing is not likely to be a true problem for the business for quite a while. They can waste a lot of money and time before it ever becomes a problem.
I'm glad to see they put a little more effort into the product page for this release than seems typical for AWS products. It's much easier on the eyes and, at least for me, much more readable.
A lot of what you are paying for in BI solutions is implementation costs, and then to a lesser extent yearly maintenance and support costs. I wonder how they intend to lower implementation costs, since that is a lengthy and inherently difficult process.
This is the direction most of the major players are trying to go.
I thought the same when I saw some of the visuals. I'd say Microsoft has the advantage in the enterprise/corporate space. Everyone uses Office already, and Power BI has been included for free as Excel add-ins since 2010. Power BI collaboration portals are also free at the equivalent of Amazon's $9 tier.
It's cool to see competition in this space. The real power isn't in building a better BI tool for BI professionals - that's pretty much a solved problem.
The problem is capturing and leveraging the business knowledge that lives in Excel spreadsheets or Google spreadsheets on business users' own drives. That's where a lot of this excitement comes in.
Power BI is an incredible product - thanks for listing it here... unreal! Coming from SAP/BusinessObjects products, which are generally overweight and expensive, Power BI nails it. The price is free, they offer a download for Windows and a web app, and it appears to have a mobile app. Worth a try. In 5 minutes, I had a dashboard built out of exported data from our accounting system.
Co-founder and CTO of Mode here. I'm not too concerned that visual exploration is going to replace SQL for ad-hoc data analysis or data modeling for business intelligence. I think proprietary viz platforms are strictly inferior to open sourced ecosystems like Vega in any case. Open ecosystems win over closed ecosystems in the long-run.
We're more interested in integrating the tools and ecosystems people are already using in novel ways than we are in being yet another dashboard or visualization tool. Dashboards and visualizations are incredibly valuable but they're just one of the many ways that analytics teams deliver value to the organization. There are a lot of "jobs to be done" for an analytics team and the list isn't getting any shorter.
Agreed re Tableau being a competitor. But I'm not surprised they're not a partner. Last quote we got from Tableau for desktop was $2k. This comes in at around $200/year. That is probably where the "1/10th of the cost of traditional BI solutions" comes from at the top of the home page.
Tableau are pushing their web system over their desktop tools. It is perfectly possible to use the free reader to consume data without needing hundreds of licences for a cloud product or the full edition.
It's possible, but last time I used it, it was pretty limited. You would need to manually feed local data and re-sync it when it changes. Not something you want to do a lot.
What is the point of this then? Tableau can directly connect to most of these data sources already. Is this meant as a replacement for a data warehouse to take transactional information into a format that is quicker to report on?
EDIT: After reading into this more it seems like Quicksight is Amazon's version of Tableau and connects to SPICE which is Amazon's version of a data warehouse. If you prefer Tableau, you can apparently connect that directly to SPICE. If you already have Tableau connected to a data warehouse, this would appear to be a new competitor to the market and wouldn't add anything to your specific setup (beyond Amazon's claims that they will do it better).
Tableau offers both desktop and server editions for data visualization. AWS QuickSight also does data visualization but doesn't offer a desktop version. They are competitors. Once Amazon enters a market, the game's over. It becomes a race-to-the-bottom red ocean.
The thing about Tableau is that they're really the industry leader in terms of data exploration workflow. Users love the tool so much that they expect any new data system put in place to work with it, so even vendors that are trying to put out their own data exploration UI are partnering up with Tableau to provide a best-of-breed option: they can still sell you the data system, but your C-level, who will not even consider using anything but Tableau, can still use it.
I'm not sure about that, because they say that SPICE is an in-memory query engine, unlike BigQuery, which is a distributed query engine that fetches data from a persistent distributed file system. It seems to me more like Spark's in-memory RDDs, which cache data fetched from external data sources.
They also support incremental loading, similar to Periscope.io, which is actually quite cool. It would be great if they could give more information (syntax, etc.) about SPICE.
> Built from the ground up for the cloud, Amazon QuickSight's Super-fast, Parallel, In-memory, Calculation Engine ("SPICE") uses a combination of columnar storage, in-memory technologies enabled through the latest hardware innovations, machine code generation, and data compression to allow users to run interactive queries on large datasets and get rapid responses.
The description sounds awfully similar to Spark, yet they say it's "built from the ground up", which I assume means "from scratch".
Now that's a nice acronym! I think Amazon have skilled people in charge of marketing. They make their announcements feel exciting but not too "markety".
On the one hand, you're totally right about the announcements. On the other hand, the names they give their AWS products are pretty mystifying. AWS Snowball?
AWS Snowball is a play on AWS Glacier. Glacier is the slow moving practically frozen storage system. Snowball takes the same stuff and hurls it rapidly to a destination. Pretty slick.
And as I understand BI, also for when you want complex graphs that can't be easily modeled in Excel.
With BI solutions you can perform "slices", as they call them, of multi-dimensional data (cubes), and then represent them as graphs that can also be used to drill down on one or more dimensions of said data.
When you have, say, 17 dimensions, these solutions are easier than trying to do the same in Excel.
I have only implemented very simple BI solutions a couple of times, so anyone with more experience can correct me if I'm wrong.
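To illustrate the slicing and drill-down idea with something anyone can run, here's a hedged pandas sketch standing in for an OLAP engine; the dimensions (region, channel, month) and the measure (revenue) are invented:

    import pandas as pd

    sales = pd.DataFrame({
        "region":  ["EMEA", "EMEA", "NA", "NA"],
        "channel": ["web", "retail", "web", "retail"],
        "month":   ["Jan", "Jan", "Feb", "Feb"],
        "revenue": [120.0, 80.0, 200.0, 150.0],
    })

    # Slice: fix one dimension (region == "EMEA") and aggregate over the rest.
    emea = sales[sales["region"] == "EMEA"].pivot_table(
        values="revenue", index="month", columns="channel", aggfunc="sum")

    # Drill down: add a dimension to the row index for finer granularity.
    drill = sales.pivot_table(values="revenue", index=["region", "channel"],
                              columns="month", aggfunc="sum")
    print(emea, drill, sep="\n\n")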
It's all just pivot tables, in the end. Excel solved most data slicing/drill-down problems years ago. The only issues have always been data size limits and access to live, up-to-date data (though Excel has also been able to hit external OLAP services for data for a long time).
I'd like to affirm your claim that pivot tables and Excel can match pretty much any BI-related inquiry where data is available. Caveat: the data has to fit in Excel.
Source: experience on Wall Street, in Fortune 500 risk management, and dumping stuff from Salesforce to make it usable beyond what our implementation would report (or what leadership could get it to do).
Excel gets a bad rap. You can do a lot in Excel including slicing multidimensional data and pivot tables. You can also pull data externally and connect to external data pipelines.
Excel is point and click and allows for a wide range of programming skill. There are also a lot of plugins, and of course you can work in offline mode. It is also easy to integrate old Excel data, etc. And you don't need a full-time programmer who knows JavaScript or some other language to produce something.
That doesn't stop people from trying. I'm sure some manager in some company is angry at his IT staff because he is trying to load 10GB of statistics into Excel and his staff is telling him that he shouldn't do that.
Surely if you're that large then you could invest in some in-house domain-specific solution. There are whole programming languages dedicated to doing statistics on datasets.
Isn't that the whole point of AWS and their competitors? The savings of passing the problem off to domain experts to work out all the details often outweighs the benefits of having a solution that is 100% customized for your business.
Not to mention that if you want this built for your company you call up the BI provider and tell them to send some consultants over to build it for you. You might train one person or possibly two at the most to be able to make tech calls with the BI provider. Much easier for a company to say "I want a dashboard that reports X, Y, and Z" and then the BI company scurries off, builds it, brings it back to you, and you hand over some cash for their effort.
Hey, what's the difference between R and a library for Haskell (or even Python)? It's interesting that a language that is limited to a particular use (statistical analysis) has taken off so much instead of a library for a general purpose language.
You helped answer the question yourself. The fact that it was built for statistical analysis means that the syntax and overall flow make it REALLY easy to understand and manipulate data. I imagine someone who is familiar with Python would gravitate towards pandas or another library, but for the BI that I do every day, R is perfect.
People love saying that BI/Data Science is 80% cleaning data (which may or may not be true), but I've found R to be the best for cleaning up 100k+ rows at a time.
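For the Python folks, the pandas analogue of that kind of cleanup looks roughly like this (the R version would reach for dplyr/tidyr); the column names and the messy values are invented:

    import pandas as pd

    raw = pd.DataFrame({
        "customer": [" Acme ", "acme", None, "Globex", "Globex"],
        "revenue":  ["1,200", "800", "N/A", "2,500", "2,500"],
        "signup":   ["2015-01-03", "2015-01-04", "2015-02-10", "2015-02-10", "2015-02-10"],
    })

    clean = (raw
             .dropna(subset=["customer"])                      # drop rows missing a key field
             .assign(customer=lambda d: d["customer"].str.strip().str.title(),
                     revenue=lambda d: pd.to_numeric(
                         d["revenue"].str.replace(",", ""), errors="coerce"),
                     signup=lambda d: pd.to_datetime(d["signup"], errors="coerce"))
             .drop_duplicates())                               # collapse repeated rows
    print(clean)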
Since 2010, Excel has had a native add-in called Power Pivot which can handle 100M+ row fact tables and provides a full dimensional modeling experience.
"Too big for Excel" is, quite literally, a problem of the last decade.
Building your business around carefully constructed employee ignorance is a dangerous strategy. It basically assumes that they won't do any useful thinking. That was fine circa 1930, where treating people as dumb pairs of hands was an effective strategy because there was so much manual work to do. But the more we turn rote work over to machines and software, the less this will work.
Agree with your comment wholeheartedly. My comment was mostly around business information flow to the market since that's regulated due to various reasons.
I think a valid question is why these threads aren't being merged, as is common practice and as happened to a number of Microsoft threads during yesterday's event. I'm not sure HoloLens has more in common with a Surface Book than QuickSight has with Snowball.
I would have greatly preferred those threads to not be merged.
It's not a particularly good user experience to have a bunch of different unrelated products being discussed under the same article. It's a lot easier for me to just click 'next' and go to page 2 if I'm interested in reading more articles than it is to keep track of what everyone is talking about in a comment section with hundreds of posts and dozens of threads.
People are genuinely interested in what comes out of re:Invent. Just because you don't use Amazon's services or don't like the company doesn't mean they paid for placement. They paid a lot of money to developers to develop great things. Why do you have to take that away from them?
Are you implying that all of the big tech companies are paying for upvotes on hackernews? Do you think that's more likely than a tech focused news site thinking that these events have interesting announcements?
All the major tech companies bundle their announcements like this in order to make a bigger splash. It's common on such days for HN to have many announcements from the same company. As long as the stories are significant and it happens rarely, this seems ok.
Edit: we buried the "preview" stories, though. That's too much when there are already several actual launches on the front page.