Do you think that Big Data has made 'gut feeling' irrelevant?

By George Hill, Managing Editor & Online Director at The Innovation Enterprise Ltd - Monday, January 07, 2013

Comments

Hi George - good question. No, the point of these technologies is to assist, not replace, human decision-making. Experience, the "gut feel", needs to be partnered with what the instrumentation tells us about business patterns. What I can say based on our work, however, is that there are many instances of "gut feel" or well-accepted truisms proving to be wrong when you actually drill into the data.

By Tom Deutsch, Big Data Solution Architecture. Program Director, Big Data Technologies focused on Hadoop / Cassandra and analytics - Tuesday, January 08, 2013

Tom, you add a good qualifier at the end there. The interesting thing about technology is that, as you add increased ontological capabilities, you embed a "gut feeling", if you will, into the technology itself. You see this in emerging tech companies that create better voice recognition software able to map the "meaning" behind requests, as well as more advanced technologies that can map behavior patterns and speech patterns and provide predictive intelligence. The problem with people doing this work is that new research heavily undercuts an analyst's ability to "catch" or "see" things over the course of a work-day. Thus the trade-off of capabilities suggests the technology will replace the analyst and "gut feelings" once the repositories and ontologies get sophisticated enough to approximate the "gut feelings" of our own intelligent processing. Thoughts? Nik. Check me out on Twitter: @scalingstrategy

By Nikolas A., Executive Strategist - Tuesday, January 08, 2013

Intuition is most valuable when infused with informed data. Data can only take you to the door of opportunity; intuition and the courage to act on it will walk you through. As a poker player, I spend a lot of time studying the odds and the percentages. But as in business, there's always a moment where the math turns up about even and I'm forced to hold or fold; it's instinct and intuition that determine what I do in those situations. Yet it's the playing and analysis of hundreds or thousands of hands that better informs my intuition. Data should infuse both tactics and strategy and be used to test your intuition after the fact.

By Nicholas W., Content Producer at Infomart, a Division of Postmedia Network Inc. - Tuesday, January 08, 2013
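Nicholas's "the math turns up about even" moment can be made concrete with a back-of-the-envelope expected-value calculation. A minimal sketch, with illustrative numbers that are not from the discussion:

```python
# Expected value of calling a bet: win the pot with probability win_prob,
# otherwise lose the cost of the call. All stakes here are hypothetical.

def call_ev(win_prob: float, pot: float, call_cost: float) -> float:
    """EV of a call = P(win) * pot - P(lose) * cost of calling."""
    return win_prob * pot - (1 - win_prob) * call_cost

# A flush draw on the turn: ~9 outs among 46 unseen cards
win_prob = 9 / 46
ev = call_ev(win_prob, pot=100, call_cost=25)
# ev comes out slightly negative, i.e. "about even" - the spot where
# the numbers stop deciding for you and instinct takes over.
```

When the EV is this close to zero, the data has done all it can; the hold-or-fold decision is exactly the intuition call Nicholas describes.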

Hi Nik - sort of, based on our work. The goal isn't to replace the analyst but to free them to focus on higher-order problems and/or spend less time collecting and more time exercising the data and patterns. I'd be careful about suggesting that machines will have "gut" feelings. I know you aren't literally suggesting that, but it is important to remember that machine learning techniques only surface data correlations rather than synthesizing net-new ideas. Put differently, they can suggest patterns once asked to work off a set of inputs, but they aren't going to create new hypotheses about which data to look at. Make sense?

By Tom Deutsch, Big Data Solution Architecture. Program Director, Big Data Technologies focused on Hadoop / Cassandra and analytics - Tuesday, January 08, 2013
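Tom's distinction can be sketched in a few lines: a program can quantify a correlation in inputs a person chose, but choosing which series to compare (the hypothesis) remains the analyst's job. The data below is invented purely for illustration:

```python
# Plain Pearson correlation - the kind of pattern a machine surfaces
# once an analyst has already decided which inputs to feed it.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# The hypothesis - "ad spend relates to sales" - is the human's idea;
# the machine only scores it. (Made-up numbers.)
ad_spend = [10, 20, 30, 40, 50]
sales = [12, 24, 33, 41, 55]
r = pearson(ad_spend, sales)  # strong positive correlation
```

The code will happily correlate any two columns you hand it; it will never propose that a third, unmeasured column is the one worth collecting.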

Nick: Good point, but we need to make sure we do not fall into "post hoc ergo propter hoc" by grading decisions after the fact as good or bad. There is value to it, but the true test is understanding the how and why of a decision more than its results. Best, Nik

By Nikolas A., Executive Strategist - Tuesday, January 08, 2013

Hi Tom, we certainly agree in principle. In practice, I suppose we could have a fun semantic discussion about what we define "gut" to truly represent in cognitive processing. While the technology and advanced ontology are only "getting there", I would argue that the associative power of data processing and correlation can lead to very complex algorithms that provide the predictive, "gut-like" capabilities analysts are paid for. That being said, the pragmatic definitional approach of computer ontology is certainly different from that of human ontology; there is an ongoing debate over whether computer ontology is not more epistemology than the philosophical intent of defining human cognitive ontology. More work needs to be done, but instead of looking at where we are, seeing where we are going is very interesting when considering analytics and big data ontology. Fun thread! Nik

By Nikolas A., Executive Strategist - Tuesday, January 08, 2013

Big data helps you make better decisions; big data does not make the decisions for you. Business acumen, intuition and experience do play a role. Maybe less than before, thanks to big data.

By Patrick Coolen, Manager HR analytics at ABN AMRO Bank N.V. - Tuesday, January 08, 2013

Patrick, you are correct: big data, analytics, etc. offer nothing but the ability to make sense of distributed [and undistributed] amounts of data. That being accepted, with enough intelligence and the right algorithms, one can embed "human-like" processing into the technology, using prima-facie-style coding for dispositions and valuations.

Certainly we can both agree that an ethics conversation over whether this can perfectly mirror a human is futile. However, can we not agree that it will be possible (if it is not already, in some companies I know of doing this type of work) to develop the right algorithms and ontologies to utilize the data and make "better" decisions than a human? My use of "better" here implies accepting the human error analysts classically experience: exhaustion from processing so much data, natural human stress and distractions, and a countless list that could continue based on the human experience. We can agree that business acumen, intuition, and experience are not always the best or most consistent graders of an employee, more anomalies of a system, no?

At a high level, it will take very intelligent people creating highly operable technologies that can access, "understand"/process, interpret, store, visualize, and then act upon information within a set of bounds. If done correctly, this sounds very human-like. Which errors have the most mission impact is certainly up for debate, but not whether the technologies do and/or will exist at such sophistication... It is certainly an uncomfortable thought, but one that needs to be seriously discussed for implementation, given mission vulnerabilities and the scale of data and technologies. Best, Nik (Twitter: @scalingstrategy)

By Nikolas A., Executive Strategist - Wednesday, January 09, 2013

Nik, I apologize in advance, but I am going to start off a bit jokingly sarcastic, then submit a more serious thought. I use this joke only because it is my favorite movie series and I mean absolutely no offense, but I think the "Matrix" has you... or at least, in your scenario, to quote Agent Smith, "It is inevitable" that the Matrix will have you/us. Ok, ok, I got that out of my system.

To be honest, though, you are theoretically talking about AI: programming the technology to think for itself in a manner that can virtually adapt to an infinite set of variables. That is all well and good if we can ever feed a program every theoretical variable down the road. "Gut feel" is often wrong because all variables were unknown (or could not be known) or not considered. True, humans toss in emotional and irrational variables that actually "junk up" the gut feel as well, but we can only hope the most tenured individual would be able to overcome that at times.

I agree that the systems will get better. Analytical tools will improve, but at best I think we'll reach a theoretical harmony as opposed to a calculated reality. At best the machines could work with probabilities, weighing things out and leaning toward the more probable outcomes. Often analysts do the same thing, but then you get that one person whose "gut" tells them to go with the 5% probability, and magically you get a Warren Buffett or a Bill Gates. Do I see it replacing many of the average analysts? Sure, eventually. But I contend that the human element, even in the minority, will always trump sheer technology based on mathematics and science. In summary, "The Problem is Choice." Ok, couldn't help it...

By Joseph D., Finance & Data Modeling Professional - Wednesday, January 09, 2013
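The probability-weighing Joseph describes can be sketched as a toy expected-value comparison. All figures below are invented for illustration:

```python
# A machine "weighing things out and leaning toward the more probable
# outcomes": rank options by expected payoff and take the top one.

options = {
    "safe_bet": (0.95, 10),   # (probability of success, payoff if it succeeds)
    "long_shot": (0.05, 100),  # the 5% play a human "gut" might still take
}

def expected_payoff(prob: float, payoff: float) -> float:
    return prob * payoff

best = max(options, key=lambda name: expected_payoff(*options[name]))
# safe_bet: 0.95 * 10 = 9.5 vs long_shot: 0.05 * 100 = 5.0,
# so a pure EV machine picks "safe_bet" every time.
```

This is exactly the gap Joseph points at: the EV-maximizing machine never takes the long shot, while the occasional human who does is the one who, now and then, becomes the outlier success story.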

Haha... it was "inevitable" that someone was going to throw that out there :) I suppose the best representation of my position is embedded in my response to Tom above: I am in no way anthropomorphizing technology. The problem with doing so, and labeling technology as having human characteristics, is that we are speaking of technology in "human" terms, and human terms are not technical terms in and of themselves. This is the same issue we see when science enters the realm of religion and other metaphysical areas: conflict in terms and definitions.

My true argument above is that if we are analyzing the statistical success of the analysis of information, the technology is only as capable as the engineer who gives it those capabilities. I am not supposing AI as a means of acting autonomously; that would be falling into the same conflict of terms. ((Can you tell I was an undergrad philosophy major yet?))

If we are looking at the production capabilities of collective mission success, strong algorithms and coding capabilities can analyze the massive amounts of data, layer an ontology of geo/demo/ethnographics on top, and make "knowledgeable" assertions about what the information is providing: the sampling error is much greater for the human than for the machine. The role of the human is not obsolete by any means, but changed: the creator, co-creating as a result of data ingest. Instead of seeing the few missions that prevailed as a result of the 5% gut, how many failed as a result? It either leads to a post hoc ergo propter hoc (fallacy), or we should be able to quantify or qualify the "why" of what happened in the situation. Thoughts? Nik

By Nikolas A., Executive Strategist - Wednesday, January 09, 2013

From my experience at Amazon, data supports business decisions!

By Marco Antonio S., Financial Controller at Choose Digital - Thursday, January 10, 2013

Gut feeling brought humanity to where we stand today and will be here forever! But it can use some help... In my opinion, big data puts some math and technology at the service of "gut feeling" and narrows it down in terms of assertiveness.

By Carlos G., Managing Director at CSC - Friday, January 11, 2013

Gut feeling is initiated by something. Having data as the initiator just makes the feeling more accurate.

By Stephen O., SEO Specialist at ELEMENTSLOCAL - Monday, January 14, 2013
