Interview with Kris Wheaton of Mercyhurst University: B2B Market Research podcast

Authored by Cascade Insights

Episode 79: Interview with Kris Wheaton, Associate Professor of Intelligence, Mercyhurst University

We cover:

  • The skills that exceptional competitive intelligence professionals need to have.
  • His thoughts on crowdsourcing and social network analysis.

For more free B2B market research and competitive intelligence resources go to: www.cascadeinsights.com/resources


Modified transcript:

Sean:

Welcome to another episode of the B2B Market Research podcast. In this episode I have with me Kristan Wheaton from Mercyhurst.

Kristan:

My name’s Kris Wheaton. I’m an associate professor of intelligence studies at Mercyhurst University, and I’ve been here for about 11 years. Before that I was in the army for 20 years in a bunch of intelligence and intelligence-related positions. I’m very happy to be here today.

Sean:

Thanks for joining us. One thing, just to give you guys some props right out of the gate, as we were having our little pre-chat before this, is that Mercyhurst in a lot of ways is “the” competitive intelligence program.

Fundamentally, whether you look at the certification stuff or you look at other programs, you guys are seen as the one that does a really comprehensive job of taking an analyst from step zero and walking them up to the point that they could really be productive when they leave the program. That’s whether we’re talking about the bachelor’s or master’s degree thing.

I guess that leads me to, really, my first question, which is, why do you think it’s so hard to create/find these people? Because, to finish off this area, my experience has been when we talk to corporate clients, they really struggle to find good CI people.

What are your thoughts on why it’s so difficult? What are the characteristics you’re really looking for, and why is it so hard to get it all into one person?

Kristan:

I think there are a couple of answers to that, or a couple of ways in which to look at that. I think the same reason it’s hard to find a good engineer or a good architect is the same reason it’s difficult to find a good CI person. It requires training. It requires education. It requires some experience with the real world of work. Those kind of guys just aren’t easy to come by.

There aren’t a lot … In fact, I can’t think of a single other applied intelligence studies program. What I mean by that is that we try very hard to expose our students, not to just a bunch of theoretical concepts, but to put them in their seat and make them do the kinds of things that you would expect a CI professional to do when they get onto the job.

We do an awful lot of contractual work here, so the students get exposure through that. Almost all of our courses are project-based. They’re designed to recreate the environment in which you’d be in. There aren’t a lot of programs out there like that, so you don’t have an academic environment that’s educating the next generation of entry-level CI analysts. I think that-

Sean:

I think the thing I want folks to understand about what you’re saying, as it relates to a corporate environment, is that it’s hard to know in advance whether someone really has the investigative chops, because a lot of people can be either analytical or really good communicators, but rarely both.

Sounds like you guys try to give them that battle experience a little bit while they’re in the program, so you can either weed people out or understand what they need to learn to get better.

Kristan:

You’re absolutely right. We oftentimes tell employers, “The most valuable thing we give you is a self-selection process.” About 150 students come in here as freshmen every year, along with about 25 graduate students. We’ll only graduate about half of that freshman class. On the graduate side it’s a lot higher, closer to 80% of the class, but we won’t graduate everybody. What happens is that they do leave: it’s not for them; they don’t like this kind of work; they don’t want to be doing this kind of stuff; or they don’t have the skills necessary to be successful in this discipline.

By the time they’re seniors, or the time they’re second-year graduate students, they actually know they like this stuff, they know they have a core set of skills in it, and they’re ready to make a career out of it. That’s just huge. You’re not hiring blind when you hire somebody like that. You’re hiring somebody who knows what they’re getting into and has the basic entry-level skills they need to be successful.

Communication skills, we teach those. Analytic skills, we teach those. All the skills you mentioned, if we don’t have a specific course for them, are integrated into the entire program, because almost all the faculty here, with one exception, have been there and done that. We’ve all been intel professionals at some point in our careers, and so we know what’s required.

Sean:

I think that’s another thing to think about if you’re a corporation, which we work with a lot: you’re going to have a variety of different people you could potentially tap for your new CI team or a new CI analyst role, but there’s going to be a flame-out rate.

Kristan:

That’s right.

Sean:

I think in corporations it’s hard to build a role where you plan for that. There’s almost this assessment that “Once I’ve put the team in place, it looks bad if people flame out.” You have to expect that, because some of these guys are not going to be everything you want. You’re going to find out pretty quickly they like to send newsletters and they don’t want to talk to human beings, or they like to go to trade shows and they don’t know how to do analysis.

I’ve told these teams that if they had the latitude, one of the stronger things they could do is actually recognize where these people are strong and build a team that isn’t aligned with, “This is your competitor to track; this is your competitor to track.” It’s more, “You’re really good at human intelligence. You’re going to gather human intelligence for the group. You’re really good at analyzing digital signals. You’re going to go do that.” It’s hard for them to do it, because you don’t have a thing to point to that you’re basically in charge of, like a given competitor.

I always tell them there’s a fallacy in that, because they’re only going to be good, exceptional, at certain elements of the collection process unless they’re a really long-time veteran. What do you think about that, overall, that general idea?

Kristan:

The research is unequivocal on this. Teams do best when they maximize their strengths. If you’re playing to people’s weaknesses, you’re going to have a less effective team. There’s a good bit of research on that.

I don’t know if you’ve read Hackman’s Collaborative Intelligence, but it’s the bible on how intelligence teams ought to be organized and run. He’s pretty clear about that. If you don’t bring the right skills to the team, you’ve got to maximize the skills that you have. So I don’t disagree with you at all right there.

You’re right about the flame-out rate. We hear that all the time. You hire a political science student in the national security community, and he gets into the CIA and he realizes that he’s not doing political science anymore. He’s got deadlines and he’s doing analysis under significant uncertainty and people are shouting at him. The same kind of thing happens in business. You see guys who are good accountants. They like accounting work, but the uncertainty and the hustle-bustle of the intelligence world is not for them.

Again, that’s why I think a program like ours is as successful as it is. We do a certain amount of that filtering before you hire so that you know when you get here you’ve got somebody who has made a conscious decision to move into that area, to do that kind of work.

Sean:

Let’s put a bow on that. That’s a good closing point. I agree that book’s a good pointer for folks to look at. The full title for folks is Collaborative Intelligence: Using Teams to Solve Hard Problems. That’s available on Amazon and all that stuff.

Let’s talk about methodology as our second broad area before we wrap the podcast, because obviously as a firm in this space … I’ve owned a consultancy for 14 years: the first one we sold and then this one … you’re always pushing methodology. It’s what clients want from you.

What do you see as the “underutilized new things” that people maybe should spend a little more time looking at, whether it’s sources, or analytic methods, or things like that?

Kristan:

From a source standpoint, my research focus is on what I call entrepreneurial intelligence, which is taking a look at what kind of intelligence support entrepreneurs might need. As an offshoot of that, I’ve really gotten involved and interested in the crowdfunding phenomenon.

I think there’s a real intelligence source possibility there. The crowdfunding environment itself provides an awful lot of early indicators, or at least is potentially a good source of early indicators, about how trends, attitudes, and markets in certain areas are changing.

I don’t think there’s been a lot of work in that area, and I don’t think people have taken it into account. When I talk to some of our corporate clients, they’re always amazed that there are things happening in the crowdfunding sphere that could potentially be disruptive to their business models, and they haven’t picked up on them yet because they’re not watching.

It goes beyond social media; it goes beyond all the other stuff. It’s really happening on some of these crowdfunding sites. I think there’s some real grist for the analytic mill there that hasn’t been messed around with by too many people so far. That would be one place I think would be worth going.

Sean:

Let’s talk about one other thing… social network analysis.

I interviewed one of the guys who created NodeXL. We talked a lot about the challenge where, if you talk about social network analysis to a room full of people, they will translate that as social media analysis, which isn’t really the same thing; it’s more that one is a subset of the other.

You could use social network analysis tools to do more advanced social media analysis, but it’s not really the same thing. I know in the NodeXL interview one of the things the guy said that I thought was awesome …

In his analogy he said, “The challenge with this type of analysis at times is that you can think about things like trade shows or human intelligence and say, ‘I know why I’d go talk to 30 customers; I know why I’d go talk to a bunch of competitors at the trade show.’ But if 300 people were standing in your parking lot holding up a sign, you would say, ‘Who are those people? Who do they relate to? And what’s the other group over the hill that they seem to know?’ It’s not that visible unless you have the tools and the techniques.”

Kristan:

Yeah. I think you’re absolutely right. By the way, thank you very much for your podcast. That was very helpful for a couple of students I have who are doing social network analysis of companies and organizations right now in one of my advanced analytic techniques classes. I genuinely appreciate the fact that you guys did that and that you were able to track down the guy who did NodeXL. We use NodeXL and like it quite a bit. Thanks for that.

On social network analysis, I think there are two things. First, you can’t do it standalone, but it certainly can help you get insights and leads so you can direct other activities more efficiently. It can also be used in another one of my favorite methodologies, which I call multi-method: you look at the same target through a lot of different methods. If you start to see the same story pop up over and over again despite the fact that you’re using these different methods, you can have a lot more confidence in your analytic result.

Social network analysis is one of those really good methods because it is mathematically rigorous, which allows you to get some deep insights in a way you might not have seen before. The downside of social network analysis, from my standpoint, is two-fold. One is that, because it’s mathematically rigorous, you can really only measure a discrete set of variables, and that’s problematic, because other things matter. The things you’re measuring may not be the things that matter in the equation.

The biggest problem I find, and this is one you alluded to earlier, is explaining that network diagram, which looks so compelling to every single decision maker I’ve ever shown one to. Eyeballs go to it immediately. It’s fascinating; oftentimes they’re beautiful; but then the question comes: what does it mean? As soon as they ask that question, you’ve got to have a 25-word answer that really encapsulates what it means. All too often what I hear is people standing up and going, “Well, this is this, and this …” They start explaining the methodology and lose their audience almost immediately.

It’s very difficult to explain the meaning of those diagrams sometimes in a very short period of time, and that’s a real skill. That communication skill is one of the real challenges I find in social network analysis.
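[Editor’s note: the kind of mathematically rigorous measurement Kris describes can be sketched in a few lines. The example below uses the networkx Python library (NodeXL itself is an Excel template; networkx stands in here) on a small, entirely hypothetical relationship graph, with all names and edges invented for illustration, to compute the degree and betweenness centrality measures that underpin most social network analysis.]

```python
# A minimal social network analysis sketch with networkx.
# All people and relationships below are hypothetical illustrations.
import networkx as nx

# Who talks to whom, e.g. observed at a trade show
edges = [
    ("Alice", "Bob"), ("Alice", "Carol"), ("Bob", "Carol"),
    ("Carol", "Dave"), ("Dave", "Erin"), ("Dave", "Frank"),
]
G = nx.Graph(edges)

# Degree centrality: how directly connected each person is
degree = nx.degree_centrality(G)

# Betweenness centrality: who sits on the shortest paths between
# others, i.e. who bridges otherwise-separate groups
betweenness = nx.betweenness_centrality(G)

# A candidate "25-word answer": the single most pivotal broker
broker = max(betweenness, key=betweenness.get)
print(broker)  # Dave: he is the only link between Erin/Frank and the rest
```

[Betweenness is often the quickest route to the short answer decision makers want: the highest-scoring node is the broker connecting clusters that would otherwise not see each other.]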

Sean:

Maybe we’ll close on your comments on this. I have a slide in a deck I sometimes put in now that I say, “Be a data scientist; don’t be a data pessimist,” because one problem is the one you mention, where people say, “The explanation was too long. I, in essence, kind of zoned out. I don’t really get the data. Let’s go back to the quotes, the verbatim. Let’s go back to the things that seem a little more transparent.”

The other side is that people sometimes don’t understand the power of the network. They look at certain tools and they forget the massive data that’s behind it. They’re looking for the exception to things as opposed to really understanding the bigger picture, which is why I sometimes do that thing about, “Be the data scientist; don’t be just a data pessimist.”

Any other comments you’ve got on communicating the value of that would, I think, be interesting for folks to hear.

Kristan:

I think what you have to do is be truthful. It has to be a true image, but you need to simplify, abstract, and filter that image down to the meaningful things, so that when the words coming out of your mouth say, “This is important,” the image makes that fairly obvious through its color and design.

Again, this is almost a graphic design problem: through color and design, the image should convey, “Oh, yeah. I can see the connection between the words coming out of the briefer’s mouth and the image. X is important,” whatever X is.

On the communication issue, increasingly here we’re starting to see the value of teaching design-based thinking and design in general. How do you use color to show things? The ability to communicate effectively visually is going to be, I think, one of the skills of the future for all intelligence analysts across the board.

Sean:

Yeah. Those are great, great points. With that, I want to thank you for joining the podcast, and I recommend folks take a look at your blog and also look at, obviously, Mercyhurst in general.

To everyone who’s been along for the ride with us on this podcast, thanks for listening and hope to have you along on the next one.