Data and analytics have become an integral part of triathlon training. Yet, most triathletes don't recognize the true power of their data or realize the incredible results they make possible. In this episode, we cover analytics for triathletes in a way that is highly insightful for both novices and techies. We separate the meaningful from the clutter. We overview advanced analytics technologies such as data mining, predictive analytics, artificial intelligence, and machine learning and discuss how these technologies are changing the way we train and race triathlons.
TriDot Podcast Episode 15:
Training Analytics for Triathletes
This is the TriDot Podcast. TriDot uses your training data and genetic profile combined with predictive analytics and artificial intelligence to optimize your training, giving you better results in less time with fewer injuries. Our podcast is here to educate, inspire, and entertain. We'll talk all things triathlon with expert coaches and special guests. Join the conversation and let's improve together.
Andrew: Welcome to the TriDot Podcast, everyone. Today is going to be really really interesting as we discuss training analytics essentials that every triathlete should use. The more time we all spend in this crazy sport, the more we pick up watches, heart rate monitors, bike power meters, cadence sensors, and all kinds of gizmos and gadgets that help us record the data of what happened in our training session. Once that data is collected, what do we even do with it? How do we know what to do in future training based on the data showing how previous training has gone? Well, we're going to get really nerdy today people, so I'm glad you've joined us for this. Now, no conversation about analytics and triathlon would be complete without my first guest, TriDot Founder and CEO, Jeff Booher. Jeff is the chief architect behind TriDot’s insight optimization technology that powers TriDot training. He's a multiple Ironman finisher who has coached dozens of professional triathletes and national champions, as well as hundreds of age groupers to podiums and PRs since he began coaching triathlon in 2003. Jeff, how pumped are you for today's topic?
Jeff: I'm very pumped, 120 PSI and ready to roll.
Andrew: Oh, that's a nerdy cycling joke. Okay, if you're listening and you're pumping your tire up to 120 PSI, you're doing it wrong. Let's just-- that's another podcast though. Well, next up is Coach John Mayfield. Now, John is a successful Ironman athlete himself. He also leads TriDot's Athlete Services, Ambassador, and Coaching programs. He has coached hundreds of athletes ranging from first-timers to Kona qualifiers and professional triathletes. John has been using TriDot since 2010 and coaching with TriDot since 2012. John, thanks for joining us today.
John: My pleasure.
Andrew: And who am I? I am Andrew, the average triathlete, voice of the people, and captain of the middle of the pack. Now, we'll all get going today with a warm up question, and then we'll get into the main set conversation all about the essential analytics for us as triathletes. For our cool down, we'll be highlighting our Tri Club of the month. Brand new feature, we're going to talk about an awesome Tri Club based out of the Houston area. So, lots of great stuff. Let's get to it.
Time to warm up. Let's get moving.
Andrew: With all of the equipment we use as triathletes, the last thing we expect to have problems with is the kit we wear in training or on race day. But even still, if you have been in a sport long enough, you've probably had a wardrobe malfunction, or an interesting tri kit experience to share. So, Jeff, John, for today's warm up, tell us one of your best adventures and tri kit stories. John, let's start with you.
John: So, to be honest and fair, this whole question was posed because of my recent story that while it was happening, I thought to myself, this is going to be great podcast material. So--
Andrew: That warms my heart so much to know that the podcast is on your mind in the middle of an Ironman.
John: Yep. So, a couple weeks ago, I was racing Ironman Arizona, and I had a new kit and it was fantastic. Biggest thing was it shaved significant time off my transitions just because I was able to just go right through. In the past, I’d changed kits because I like to be comfortable and fresh, and it's hours long each leg. So, this time I went to a one-piece kit. Rocket Science has a fantastic one-piece kit that we've gone to, so I was super excited to wear that for the first time at Ironman Arizona.
Andrew: You looked great in it, John.
John: Thank you.
Andrew: You looked great.
John: Thank you. I got through the swim great, got through the bike great, and I head out on the run and like my T2 was by far the fastest I've ever had at an Ironman mostly because of this kit that you know, I just had to put on my shoes and I was out. So, I ran the first couple miles, and everything's great with the kit. The only thing was it was starting to get warm. And so it's very important to control your core temperature in racing. Once the core temperature gets too high, your body begins to shut down, it’s very difficult to get your core temperature back down. And when you're running too hot, that's when your body is going to shut down, and that's where a lot of people, their day ends because they overheat. That's one of the very difficult things in racing in hot climates is being able to control temperature. It wasn't a particularly hot day, but it was warm enough where I could feel my body temperature was coming up. It was in the desert, it was a sunny day, and so I started my cooling protocol. And so I was dumping water over myself to keep my temperature down-- [crosstalk]
Andrew: Cooling protocol’s a real fancy way to say dumping water on yourself, isn’t it?
John: Well, they’re multiple things and one of those is ice. And so ice in the palms, ice under my cap, and I put ice down the back of my jersey. And historically, I've always worn two-piece kits and that ice will kind of begin to melt and roll down the back and feels great as it goes down. And then it falls out the bottom of the jersey. Well. Again, this is the first time I've worn a one-piece kit so-- [crosstalk]
Andrew: So, where did it fall to, John?
John: Well, it didn't fall. It just kind of settled down in the nether regions. And so I'm running and I can envision exactly where I was on the Ironman Arizona run course, it's down there by near the dam on the kind of where the swim exit was this year, as I was-- [crosstalk]
Andrew: I can envision where you were.
John: ...running right there. So, I want to say it was around mile five and I’d dumped this ice down the back of my jersey and it's made its way down my back and-- [crosstalk]
Andrew: As per ush.
John: Yeah, settled in down there in the nether regions and it's quite uncomfortable, it's kind of awkward but it's working. You know that actually is one of the places that you can apply ice and cold water to help keep your temperature down. Think about you know, wading into a cool pool, it's hard and that's because of-- [crosstalk]
Andrew: That's one of the more difficult places to get that cold water to.
John: Yeah. And that’s because of that. So, anyways, I'm running along, I've got ice in my butt crack, and it’s all in there and I'm thinking man, this is-- And there's nothing I can do about it. It's a one-piece kit, so I can just-- it's there. I just have to wait for it to melt. And so I just kept running and I was thinking man, this is going to be-- I'm laughing to myself like, this is funny, and this is going to be hopefully some quality podcast material. So, there it is. I hope that doesn't disappoint. But yeah, that was my thought back at mile five of Ironman Arizona last year. I hope that it is enjoyed.
Andrew: Oh, it was well received, John.
Andrew: It was well received. What it makes me think of is the show, FRIENDS, right? The classic TV show, they would name every episode by something that happened in that episode.
Jeff: They had the one about [crosstalk]
Andrew: Yeah, so if Joey got a promotion in an episode, it would be the one where Joey got a promotion, like they would just name episodes very, very easily like that. This podcast episode, if we did that, would be named the one where John got ice down his butt crack, and that's a crowd-pleaser. That's just absolutely a crowd-pleaser.
John: Good. And I think we should.
Andrew: Yeah, great way to start-- [crosstalk]
John: I’ll be disappointed if the title’s not that.
Andrew: All right. Well, prepare to be disappointed. All right. Moving on to Jeff Booher.
Jeff: Hate to go after that one. It's kind of hard not to be anticlimactic.
Andrew: I mean, we could just transition to the main set after that, but uh--
Jeff: Well, I do have something that's kind of humorous, and it's very different. So, it's a different category. This one’s something that occurred not actually wearing a kit. Our pro team, this was about 10 years ago, was ordering with a different vendor, not Rocket Science. A lot of those kits are made over in China, and there's a language barrier and all this stuff. So, they're pros, they have their names on the kits, and so we have three identical kits with three different names on them. We send the order over there and go through just a headache of all the, you know-- if anyone's ordered custom kits, you know how that can be just a headache. We wait weeks and weeks and weeks, the race is coming up. They had said get the whole design ready and just put on there the last names of the people you want, so great. We finally get the kits back, and there's one kit, and across where the name goes it says Montez, Han, Kershav with commas in between. So, we had three names in little bitty letters so that all three names could fit across the chest and the butt.
Andrew: So, you got one jersey instead of three?
Jeff: One one-piece kit with three names and commas in between them going across the chest and the rear end.
Andrew: And they didn't realize that it was supposed to be on separate kits.
Jeff: Three different people-- [crosstalk]
Andrew: And did it for separate athletes.
Jeff: So, I thought that was kind of humorous, yeah. Hadn't happened since then though.
Andrew: Thankfully, thankfully. Yeah. The one I was going to share, I'll try to keep it brief because John's took a while, and for a very good reason. I was doing a half Ironman and I was wearing a DeSoto tri top. I've always enjoyed their stuff. It was a full-zip tri top, and for this particular race we were traveling, and while traveling, you know, you're in a different country trying different cuisines. I don’t know, I think it was just two weeks of eating food my body wasn't used to, but I had more porta-potty stops in this overseas half Ironman than any other. And on my third porta-potty stop on the run course-- you're in a porta-potty, it's humid, you're dehydrated, you're trying to zip that jersey back up. Well, the zipper jammed really, really badly right at the bottom of the jersey. I probably spent 60 seconds just trying to get that thing unjammed and trying to get it zipped up. So I spent probably the whole last six or seven miles of the run course with the jersey totally undone, just running down the run course. And so my finish line pictures are me coming across the finish line, jersey just flailing open, looking like a hot mess, dehydrated, tongue sticking out. And I proudly posted that to social media, of course, because I survived the half Ironman but my jersey did not.
Jeff: Yep, so you were bare-chested. So, this is very wardrobe malfunction, you and Janet Jackson, you have something in common.
Andrew: I was the Janet Jackson on the course, that’s exactly right.
John: Fortunately, you were in Europe and no one even noticed.
Andrew: Fortunately, I was in Europe and no one knew me. That's great. That's perfect.
On to the main set. Going in 3, 2, 1.
Andrew: Our main set today is brought to you by our friends at Garmin. In the fitness and multi-sport market, Garmin products are the gold standard, known for their compelling design, superior quality, and best value. As a triathlete, Garmin can be and should be your very best friend. They offer best-in-class GPS watches that can track your every swim, bike, and run with ease. When you are out on the bike, Garmin’s Vector power pedals can measure those all-important watts, while their Edge cycling computers conveniently display all your data in real time as you ride. You can also bring Garmin into your pain cave with their Tacx indoor trainers and accessories. I tell everyone who will listen that my Tacx Flux indoor smart trainer is the best investment I have made in my own triathlon training. The best part is, Garmin is fully integrated with TriDot. So, your Garmin Connect and Garmin health data seamlessly streams to TriDot and your training is continually optimized. So, head to Garmin.com and check out all the cool tech they have to offer.
All across the Wide World of Sports, more and more teams, coaches, and programs are leveraging data and analytics to improve athletic performance training. The multi-sport world has caught on, and analytics and data-driven training are all the rage these days, and for good reason. There is no doubt that working smarter, and not just harder, can go a long way for your triathlon training. So, we buy watches and monitors to record quality data. But then what do we do with that? How do we analyze all the data that we are collecting in our training? Now, we're not going to dig into individual training metrics today; we're going to focus on the actual technologies that are used in decision making with data. So, today I've brought in Jeff and John to walk us as athletes through the field of training analytics to learn what is essential for us to know about data-driven training. So, guys, let's start at the basics and establish some things, because right before we started recording this, we were talking about what athletes need to know when it comes to data and analytics. And I think for most athletes out there, a lot of these terms just kind of start blending together. We know data is important. We know analytics is important; we see that in other sports as well. So, John, talk to me about what you see out there that athletes need to know when it comes to data analytics.
John: So, the most common thing that we see are accumulation metrics. This can fall under a lot of different names: miles, hours, yards, that sort of thing, with the common logic that more is better. So, oftentimes, this is what the athletes are looking at. This is their data that they're quote unquote “analyzing,” with the sense that the more they do, the better off they're going to be, the more prepared they're going to be for a race. The more training they do, the more hours, the more miles they log. This is what they compare to one another, which is something that's pretty common to see: I logged X miles this week, I ran this many miles last month, I did this many hours of training. And that gives somewhat of a common ground for one athlete to compare themselves to another, which is not particularly beneficial or even useful.
Jeff: I think it’s kind of psychological. I mean, it's an accumulation metric, but it's not a logical, this accumulated whatever it is; fitness hours, miles is not leading to better training necessarily, maybe, maybe not. But it's a sense of accomplishment, or just an emotional bragging right, a satisfaction-- [crosstalk]
Andrew: And we saw this at the end of 2019: Strava emailed every athlete a link saying, “Hey, here's all your stats on Strava.” And everybody was posting almost bragging rights of “Oh, look at all the miles I logged last year.” It's like, well, what did you do in those miles?
John: And those are fun. They’re neat to look at. And it's amazing to go back over a year and look at how many miles you rode, and it's the equivalent of riding from here to somewhere really far away. And so that's what I was saying: it's very common to use these accumulation metrics to gauge progress, but it does nothing to speak to the quality of training. So, as heavily as the space relies on these accumulation metrics, it really neglects quality. It's quantity over quality when the vast majority of athletes are looking at data and trying to use analytics.
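For the data-minded listener, John's point about accumulation metrics can be sketched in a few lines of Python. The athletes and numbers below are invented purely for illustration:

```python
# Two hypothetical athletes with identical weekly mileage but very
# different sessions -- an accumulation metric cannot tell them apart.

athlete_a = [(10, "easy"), (10, "easy"), (10, "easy")]        # (miles, session type)
athlete_b = [(10, "easy"), (10, "intervals"), (10, "tempo")]

# The accumulation metric: total weekly miles.
total_a = sum(miles for miles, _ in athlete_a)
total_b = sum(miles for miles, _ in athlete_b)

print(total_a, total_b, total_a == total_b)  # both logged 30 miles
# ...yet the training stimulus is clearly not the same, which is why
# mileage alone says nothing about the quality of the training.
```

The totals match even though the two weeks of training are very different, which is exactly the blind spot John is describing.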
Andrew: Yeah, John, I definitely see the same thing that that's typically what athletes are looking at, right. They're doing their workouts and they're kind of after the workout looking at oh, here's how many miles I did. Here's how many yards I swam. Cool, check that box off, got that workout done, and then move on to the next one. But there's a lot more to analyzing our data and analyzing our workouts and prescribing training based off of that than just the miles and yards and what we're doing. So, we're going to dive a little bit deeper than that today. And Jeff, I'm really going to lean on your background in analytics as we start here. Now, you've been working on analytics and technology for 30 years now. So, before we kind of dig in here, I'd love to kind of hear from you why is it important for athletes to have a working knowledge or an understanding of the analytics that are applied to their training?
Jeff: Well, Andrew, there are a lot of words and technologies coming out that we see around us: predictive analytics, advanced analytics, AI, machine learning, big data.
Andrew: Those are like buzzwords now, right, for companies?
Jeff: Yeah, you see them everywhere. Regular commercials, I mean, they're on your phone, they're on Google, they're on-- [crosstalk]
Andrew: We use artificial intelligence.
Jeff: Yeah, all these different things. So, when you're looking at that, I think in training you don't see it yet. You’re starting to see it a little bit. We've been doing this, again, for more than 10 years, but you're starting to see it a little bit more. And on understanding all these different areas, I think a lot of people at first are intimidated, or they may not understand or know what those terms mean. But I’d point back to the 90s and the PC revolution, then the internet in the early 2000s, and then smartphones. If you think back, during all of those times these new terms, these new words started coming out, and you're trying to differentiate, and everyone's claiming, using buzzwords to sell their stuff. So I think it's not something that is coming; it's something that's already here, already around us. And so I like the term that you used, a working knowledge. You don't have to know how a cell phone works, satellite technology, all of those things, but you need to have enough knowledge that you know, is this good reception? Is it-- [crosstalk]
Andrew: To use it properly.
Jeff: Yeah, to use it to make the decision: which one do you buy? What kind of plan do you get? You know, just some basics. And so I think that's the best way to approach our podcast today: how do you use analytics? What are they used for? And what are some of those terms we'll use in the discussion? And I think the first thing is to consider the purpose of analytics. That might seem a pretty basic question. I’d even ask listeners to pause: what's the purpose of analytics? Why do you look at the data?
Andrew: We collect all the data? It's there for us on Garmin or Strava, or whatever platform-- [crosstalk]
Jeff: So, why do you look at it? Why do you look at it?
Andrew: To know what I did.
Jeff: So, that's great. We'll kind of come back to that a little later, because that is looking backwards at what you did. You're not making any decisions based on that. The purpose of data is to be able to analyze it, and there have to be certain qualities of the data. Then when you do the analytics, the purpose of the analytics is to gain insight to make decisions. That’s our insight optimization engine. That's what we've been doing for 15 years: creating the technology to make that data and the analytics possible to make future decisions. So, working back through that, the technology is now in place and we’re able to make the decisions. So, the functional knowledge, the working knowledge if you will, the familiarization that athletes need to have: they need to understand some of these terms we’ll go through. Not necessarily to perform the technology, they're certainly not going to code the technology, but they need to understand it enough to make decisions about their own training. So, is technology deciding things about your training, or a coach, or are you deciding what your training is? I love this saying, it was kind of used in business: the most important thing is not who decides, but who decides who decides, kind of like with a president and everything. There are a lot of people in big roles of responsibility making big decisions, but the ultimate decision maker is the one who decides who gets to decide.
Andrew: So, as athletes, we know we're collecting this data on our training, and we know that there are people more knowledgeable than us who can help us use that data to guide our training. So, we need to be making a good choice on who we're allowing to-- [crosstalk]
Jeff: Right. And so you don't have to be an expert, because you don't need to make the decisions, but you need to decide who's going to make the decisions. So, you need to decide who's going to decide. Is that a coach? Is it you? Is it technology? Is it technology and a coach? What is the mix that's going to produce the best outcomes for you? If you make that right decision, it's huge, and probably the most important decision that a triathlete makes. It's going to determine how you spend your time, how you progress or don't. Do you plateau? What's your injury risk? Do you enable or do you limit your potential? It's all based on who or what is making the decisions about your training using this data. Everybody loves to cling to the data, look at the data, watch the data. But are they really using it to drive their training?
Andrew: Yeah, that's great perspective. I mean, every athlete listening to this podcast is going to be the one who decides who decides what training they'll do. So, there couldn't be a bigger decision than this for us as triathletes, and we definitely want to make an educated and informed decision.
John: So, this conversation is going to be beneficial for those new to analytics and optimized training, but also for those currently using TriDot and optimized training. It gives understanding and confidence in knowing the data science behind the decision making. So, one thing we often say is trust the process. And we're not asking for blind trust, or to just have faith in this software that’s creating training; we want to provide insight, and share and educate. This is why we say trust the process: because everything is based on data and analytics. It's not based on theories or philosophies or things we've just made up. It's actually based in analysis of massive amounts of data.
Andrew: Yeah, and actually, this topic isn't just important for triathlon training. Taking the mindset of having a working knowledge of how certain things work would help our listeners better understand similar technologies in other fields or industries. So, John, as a coach and an athlete for more than 10 years, you must have seen a lot of change in the triathlon space when it comes to data and analytics. In recent years, I've been seeing the terms data-driven training, analytics, and AI more and more. Now, to me, this implies that data is driving the training design and decision making. Is that accurate?
John: In some cases, yes, and in some cases, no. So, as you mentioned, a lot has changed in recent years. But I also see a lot out there that hasn't changed: a lot of athletes and a lot of coaches are still using very, very basic-level decision making. They think they're leveraging data to drive the training, but in reality, they're not. What they're actually looking at is what we refer to as descriptive data. This is all the data that's past tense, the log-and-track stuff: pace, heart rate, power, miles, hours; all these things that the devices are capturing. These are the things that athletes use to execute their training, and then they go back and look at these descriptive analytics after the fact. They're a past-tense, rearward-looking view of what the athlete has done. They do a great job of detailing what was done in the session.
Andrew: This is what you did.
John: Exactly. And that's it. It does nothing to say what now. There's nothing in that historical information that tells the athlete what to do next. So, even though the coaches and athletes are going back and looking at this historical data, it's still up to something other than the data to decide what happens next. It's either an individual philosophy or theory or experience or trial and error; it's kind of more of the same. So, even though they're looking at the data, maybe trying to find some insights in there-- there's a lot of software out there that digs really deep into what happened historically, and that's what they often rely on: look at the past, and then look really close and get really granular on what happened in the past. But you can keep looking backwards on a very fine level, really diving into that historical data, and it’s not going to tell you what to do next. It's like driving by looking in your rearview mirror. You can have a fantastic view of what's behind you, but that doesn't tell you what to do next; it doesn't tell you what's ahead. And that's really the difference in optimized training: using data to then determine what happens next. And that's only achieved through these advanced analytics.
Jeff: So, you used the term advanced analytics; that's a broader category. There are a bunch of categories, but the main three are descriptive, predictive, and prescriptive. Descriptive is what everything else out there is; it's looking backwards. And then advanced analytics is kind of everything forward, including predictive and then prescriptive. So, predictive is predicting an outcome in the future, and then you make a decision on that and prescribe something for a particular use case or a particular person in a certain circumstance. And so, recognizing what's driving your training involves two things: the type of data or analytics being used, and then the decision making process itself. So, what is making the decisions? Is it technology, is it a person, are they applying logic? You can look at a lot of the other software out there. I’ll throw out a couple of names: Final Surge, TrainingPeaks, there are a number. And look at what they do: they log and track your training.
Andrew: So, just another way to record what’s [crosstalk] just happened.
Jeff: Correct. Garmin Connect, there's a ton of them that record and show you, like, you know, Strava: here's what you did last year. Nike, or is it Nike-- [crosstalk]
Andrew: Nike has one, yeah
Jeff: Garmin has one that says Beat Yesterday. That's their slogan, Beat Yesterday, but it doesn't tell you how to beat it. It shows you what today was, and then it wants to inspire you.
Andrew: Beat it as in be faster? Beat it as in go farther? Beat it as in--
Jeff: Yeah, do better. And so that's the contrast with technology where the analytics are used to prescribe the future training.
Andrew: I think a lot of athletes though, they think that's what the art of coaching is. They think that it's the coach's role to kind of help them decide what to do next.
Jeff: Well, there are a couple of things there. One is that if the data were driving the training-- if you have multiple expert coaches all looking at your historical data on those platforms, and the data is driving it, then all of them should prescribe the same thing for you to do next, right?
Andrew: Because they should all know what's best for me based off of what I just did.
Jeff: Well, you just said they should know. But what I'm saying is not that they should know. It's that if the data is driving the training, and coaches look at it and are driven by the data, then all of the coaches should prescribe the same training.
Andrew: Because they should all know what the best training is for me as their athlete.
Jeff: But you're again going back to “they should know.” And you applying their logic proves my point there. If the data is driving the training, then you could take your training plan, all your historical data, all of your biometric information, and the coaches should all say the same thing. If they're not, then the data is not driving the training; they are driving the training with their judgment, their philosophy, or, like you said, their art.
John: So, the data is the same in that scenario; they're all looking at the same data. The difference is their own approach: their education, their experience, their philosophy. The training plans are going to be different, and chances are they're going to be vastly different, and only one of those can be the best. And that doesn't even necessarily equate to what is optimal. So, that's where we're utilizing these analytics to remove that, and truly look at data and be able to leverage things like predictive analytics to see, if an athlete does X, what is the expected outcome of that, and then we can compare. And then we get into these prescriptive analytics that prescribe specific actions to achieve that.
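The three categories of analytics the guys describe can be illustrated with a toy sketch. The paces, the naive linear trend, and the decision rule below are invented for illustration; they are not TriDot's actual models:

```python
# Illustrative sketch of descriptive, predictive, and prescriptive analytics.
# All numbers and thresholds here are hypothetical examples.

recent_run_paces = [5.40, 5.32, 5.25, 5.18]  # min/km over the last four weeks

# Descriptive: summarize what already happened (rearward-looking).
average_pace = sum(recent_run_paces) / len(recent_run_paces)

# Predictive: project a future outcome from the historical trend.
# Here, a naive linear extrapolation of the average week-to-week change.
weekly_change = (recent_run_paces[-1] - recent_run_paces[0]) / (len(recent_run_paces) - 1)
predicted_next_pace = recent_run_paces[-1] + weekly_change

# Prescriptive: turn the prediction into a concrete training decision.
if predicted_next_pace < average_pace:
    next_session = "add an interval session at threshold pace"
else:
    next_session = "repeat last week's aerobic volume"

print(f"descriptive:  average pace {average_pace:.2f} min/km")
print(f"predictive:   projected next pace {predicted_next_pace:.2f} min/km")
print(f"prescriptive: {next_session}")
```

The descriptive step only reports the past; the predictive step projects forward; and only the prescriptive step actually tells the athlete what to do next, which is the distinction being drawn in the conversation.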
Andrew: So, what's fun as an athlete is kind of like coming into this talk. I mean, again, we see in sports, and I said this earlier, we see the word analytics just kind of thrown around, right? I mean, the NFL teams are using advanced analytics to pick their next player, we've seen those terms. And I mean, Jeff, a moment ago, you said, “Oh there’s predictive analytics, and there's advanced analytics, and there's this.” And it's like, “Oh, there's different types of analytics? And they all have-- they serve different purposes?” Like good golly, like, so many people could claim that they're using analytics, and you don't even know what type of analytics they're using.
Jeff: That's exactly the point. What I want to highlight is you just hear those words, oh, wow. And then you trust it. You make the decision and you put your trust, your future, your aspirations, your injury, your satisfaction, your goals, your PRs in the hands of something that's using the word analytics.
Andrew: Yeah. So, let me move to this, since we're talking about coaching. How is analytics technology better suited to make all these little training decisions than a coach is? I mean, coaches have traditionally been the main decision makers when it comes to training design, right? So, again, talking about us as athletes, it's our responsibility to pick who we're using to lead us in the right direction. Why is analytics technology a better pick to guide us than a coach?
John: I think it's for those reasons that we discussed. Coaches play a vital role in triathlon training and in the triathlon space. We are triathlon coaches. It's what we're passionate about, it's what we love to do, it's our job. We love working with athletes and fostering their triathlon experience. But as valuable as coaches are, they're valuable in certain aspects, and there are better ways to do other things. There are a lot of examples of this in other spaces, like medicine, where, as critical as physicians are to our health, they have leveraged technology to do things that they used to do. It's not a replacement; it's an enhancement. So, what we're able to do in leveraging software and these advanced analytics in creating optimized training is, one, produce the best training plan possible for the athlete. We're able to prioritize and emphasize the athlete’s success, their enjoyment, their overall health. Better results in less time with fewer injuries is what we're able to deliver to the athletes. So, now the athlete has a better triathlon experience. And then the coach is able to-- [crosstalk]
Jeff: Yeah, we're really decoupling the training design from the coach. It used to be that only a coach could design your training. If you wanted a good training program, only a coach could do that. But now those are separated. The technology can design the training, but there are still a whole lot of other things where, you know, the coach is more valuable.
Andrew: So, the coach is still valuable?
John: So, actually what it does is actually enhances the role and the contribution of the coach. Because going in and looking at every single session that an athlete does, there's a tremendous amount of data that is created. What most people don't even realize is that for every metric that is tracked in a session, devices track that on a per second basis. So, take, for example, a run session; you have your time, you have pace, distance, cadence, elevation gain, and there's many more than that. And each one of those metrics is tracked on a per second basis. So, there's actually 3,600 lines of data per hour generated in the training. [crosstalk]
Andrew: So, when I let up on my hard interval to pet that dog going by me on the sidewalk, and I think that I'm hiding it by speeding up a little bit-- [crosstalk]
John: Every single bit of that is going to be reflected within that data file. And the truth is, no one has time to go in and really look at the actual data that is created within a session. So, that's why we rely on averages and charts and that sort of thing to kind of simplify the data, to make it more digestible. But that doesn't even provide the true insight as to what happened within that session. And it's not the best use of the coach’s time. To go in and analyze every single training session, it's just not feasible.
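To make John's point concrete: a one-hour run recorded once per second really does produce 3,600 rows, and a session average can completely hide a brief mid-interval let-up. A minimal sketch with hypothetical pace numbers (this is an editor's illustration, not TriDot's actual pipeline):

```python
# Hypothetical per-second pace series (minutes per mile) for a 10-minute
# interval: steady 8:00/mi, a 30-second let-up to 10:00/mi, then a slight
# surge to compensate. All numbers are made up for illustration.

def average(series):
    return sum(series) / len(series)

steady = [8.0] * 600                                    # 10 min at 8:00/mi
with_letup = [8.0] * 285 + [10.0] * 30 + [7.79] * 285   # let-up, then surge

print(round(average(steady), 2))      # 8.0
print(round(average(with_letup), 2))  # 8.0 -- the let-up vanishes in the average
print(round(max(with_letup) - min(with_letup), 2))  # 2.21 -- raw data shows it
```

The two sessions are indistinguishable by their averages; only the per-second file reveals the dog got petted.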
Jeff: We believe in really letting technology do what technology does best: analyze and optimize the data, and letting the coach do what the coach does best: work with people, work with human beings.
John: So, when the software does all the analysis, when it actually reviews 3,600 plus lines of data for every one-hour training session that is done-- [crosstalk]
Andrew: For every athlete.
John: For every athlete, that frees the coach to do other things. And these are the things that we found over the years that one, the coaches enjoy, and two, the athletes truly value. So, these are the things-- [crosstalk]
Andrew: John, you as a coach, you don't enjoy going second by second on that many data files?
John: Not every day. You know, personally speaking, what I love doing is engaging with athletes, encouraging athletes, mentoring athletes, walking them through the challenges that they face, making sure that the next training session they have coming up fits into their lifestyle; how do we make adjustments to training? How do we best prepare? How do we recover? How do we prevent injuries and all these things that the coach can contribute to in walking the athlete through this journey? That's what I enjoy, that's what the athletes most value, and that's really where the coaches make their largest contribution. When these athletes cross the finish line and they're doing their race reports, it's awesome to hear shout-outs to the coach; they're not saying, “Hey, coach, thanks so much for all this great analysis. And thanks for this training plan that you put together for me.” It's always, “Thanks for the accountability. Thanks for the motivation. Thanks for all the counseling. Thanks for those tips. Thanks for preparing me for this race.” And so that's what we're able to do, give the athletes those things. So, as Jeff mentioned, the technology and the software creates the training, it adjusts the training, it does the analysis. It does what software does, and it allows the coach to do what we as people do. We care, we empathize, and we provide that level of service back to the athlete.
Jeff: And what we're seeing out there in the space is, just like every other space, there are going to be people who don't adapt. You can look at any industry; take doctors: there are MRI machines, and if there are people still trying to diagnose that kind of stuff without them, they're not in business anymore. Those that adapt and use the tools, learn how to use them, morph their business, create different value, find more time to do higher-value things; those are the ones that are going to be successful and thrive. And I think it's even beyond time; there's just a capacity issue. It's not just the time and all the, you know, lines of data, but all of the data has to be standardized and normalized and contextualized. Even if a coach was able to look through all of those things, do they know that individual's genetics, their past, their body composition; all of these different things? And the more the coach does know all of those things, the fewer athletes that coach works with. So, every athlete's different, and a coach's population size is not big enough to learn causation, even over many many years. Because even if they're applying the same philosophy, trial-and-error techniques, beliefs, whatever, every athlete is different-- [crosstalk]
Andrew: Yeah, because if a coach is working with me and that same coach is working with you know, my dad who is 60 and way taller. I mean, we're different people, we’re different athletes. And so if you apply the same philosophy to me versus my father, we have different training needs.
Jeff: So, either you're working with a ton of athletes and you don't have very much intimacy with any of them, or you work closely with a few and you have a lot of the data necessary, maybe not all of it, but then your sample size isn’t big enough. But with technology, technology can have the big sample size, normalize the data, standardize it, contextualize it, and make those decisions very quickly.
Andrew: I remember when I first was thinking about using TriDot myself as an athlete, I was looking for a training program, or a coach, trying to decide how I wanted to train for my first Ironman. And when I was checking out the website, there's a lot of stats, right, on the homepage on how much better TriDot training can do for you. And one of the stats that's on there, on the website, is that TriDot produces 2.4 times more improvement than training designed and monitored just by a coach. Which is a crazy impressive stat. And you see that and you're like, “Wow, really?” So, is it the analytics of TriDot doing its job and a TriDot coach doing their job kind of combined that lead to that 2.4 difference?
Jeff: Well, just to clarify there, it's 2.4 times more improvement than training designed and monitored by a non-TriDot coach. So, we took a coach without the technology-- [crosstalk]
Andrew: with whatever plan they have generated.
Jeff: Yeah, but for sure. I mean, separately, we looked, and it's like 8.9 times more effective than doing your own training. And then, I don't remember what it is, three and a half times, something like that, more than buying a coach-designed plan but not having the coach monitor it ongoing. And then the 2.4 times more effective.
Andrew: All significant numbers.
Jeff: Yeah, yeah, they're all huge. This is a great example, okay. I looked at this a couple of years ago; it's a pretty classic use case. There were a couple of grad students out of Stanford who were looking to solve a problem. A lot of AI, machine learning, deep learning gets applied to recognizing images; speech recognition, facial and image recognition, and natural language. You've seen those in chatbots and, you know, facial recognition, Facebook; all this kind of thing. And so they're creating-- [crosstalk]
Jeff: Snapchat. So, they're looking at those images and trying to create their projects and their case studies, learn the technology, apply the technology differently. So, this one case that's used quite a bit is diabetic retinopathy, and they're solving that problem with deep learning. The problem was that they'd have these, they call them fundus images; it's like a retinal scan, but it's all the white matter around it too. So, it's more than just your retina, it's a bigger picture of your eye. And ophthalmologists are able to look at that and classify it on a scale of zero to four, zero being it's normal, fine, four being-- [crosstalk]
Andrew: That picture of the patient's eye?
Jeff: Correct. They're looking at all the nerves and the blood vessels, capillaries, white spots; all that kind of stuff, and they're categorizing it into these five different categories. And that diabetic retinopathy can lead to blindness, so it is a very severe thing, but if it's caught early, you can treat it, and they don't go blind. So, this is a very big thing. I don't know what they say, 50% of the world's diabetics are susceptible to this. So, in a lot of third-world countries, we could find it and treat it, but we don't have enough ophthalmologists to go around looking at all these images. And so they're solving it with deep learning. They create these neural networks, and they took these images; there's a training set they used, and it's like 35,000 images.
Andrew: That’s a lot of eyes.
Jeff: Yep, a ton. So again, that's a huge sample size, and they know precisely which category each one is correctly classified as. So, then they create this deep learning model that learns and is able to predict what category each of those images should be in. It learns on the training set, and then they put it to another set. In this case, it was 15,000 images, and they say, okay, predict these; these have not been through the learning model, you've never seen these images before. Tell us what category they belong to. But before they did that, they created a doctor group of ophthalmologists, took these images, and showed them to all the doctors, and so they got their subjective reads. And so the doctors would-- [crosstalk]
Andrew: And so, they’re trying to decide in these images, how prone is this person to this disease?
Jeff: Yeah. So, this is normally how it would be done. You take it to the ophthalmologist, they look at this image, they say okay, it's a three, it's a two, it's a four, whatever. 65% of the time they were accurate.
Andrew: 65% of the time a trained doctor was accurate.
Jeff: And so there's that subjectivity. And they're doing something right; it's not 50/50, that's not a really low number, they're getting close. But there still is a margin of error. They're looking at something that's complex, but it's not more complex than your training and your whole history and all of the stuff that a coach looks at to make decisions. And they're only making one decision; is it a 0, 1, 2, 3, or 4? They're making one decision.
Andrew: Is this person fine? Are they at risk? Do they need surgery right away?
Jeff: Right. So, more volume, less volume, more intensity; all of the things that a coach looks at, this is one decision. So, when they train the model, the model learned and it predicted it to a 98% accuracy.
Andrew: Wow. So, way more accurate than the doctors.
Jeff: Far more accurate. So, you're getting better service, and it's much cheaper. So, they're able to take that technology all over the world. They can just take cameras, get the images, put them through the model, predict whether people need the treatment or not, and give it to them.
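The comparison Jeff describes boils down to classification accuracy on a held-out set the model never trained on. A toy version of that evaluation (the grades and predictions below are fabricated for illustration; they are not data from the actual study):

```python
def accuracy(true_grades, predicted_grades):
    """Fraction of retinopathy severity grades (0-4) predicted correctly."""
    correct = sum(t == p for t, p in zip(true_grades, predicted_grades))
    return correct / len(true_grades)

# Toy held-out set: true severity grades vs. two sets of predictions.
truth = [0, 0, 1, 2, 2, 3, 3, 4, 1, 0]
panel = [0, 1, 1, 2, 3, 3, 2, 4, 0, 0]   # hypothetical human reads
model = [0, 0, 1, 2, 2, 3, 3, 4, 1, 1]   # hypothetical model output

print(accuracy(truth, panel))  # 0.6 -- 6 of 10 graded correctly
print(accuracy(truth, model))  # 0.9 -- 9 of 10 graded correctly
```

The study Jeff cites did exactly this at scale: roughly 65% agreement for the human graders versus 98% for the model on the held-out images.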
Andrew: And the great distinction here is that the AI, in this case, is helping the doctors. Because instead of the doctors spending their time looking at all of the eyes trying to decide who needs the surgery, they can just go in and do what they're trained to do and do the surgery.
John: They can go and provide care to those that need it. And kind of what we’re saying with the coaches, it enhances what they're doing and allows them-- [crosstalk]
Andrew: It’s not replacing the doctors, we're not replacing the coaches.
John: It allows them to be more effective.
Andrew: Now John, some athletes are probably familiar with metrics like Training Stress Score, TSS, other things like it that are out there to kind of measure how hard you worked in a given workout. So, the coaches that use this to prescribe training, is this an example of prescriptive or predictive analytics?
John: It’s not. TSS is actually a descriptive analytic. It combines multiple descriptive analytics into one; it's a combination of intensity and time. What it attempts to do is quantify the amount of work, the amount of training stress, in a session by combining the intensity level with the time. It uses threshold as the baseline, so 60 minutes at threshold is equivalent to a TSS of 100. So, if you did a one-hour session at your threshold level, that would be a TSS of 100.
Andrew: That sounds painful.
John: That sounds awful. From there, you can do two hours at 50% and achieve that same 100 TSS score.
Andrew: So, wait, so if I go at 50% of my threshold so like, let's say my-- So, threshold right now, as we're recording this, I think is like 185. So, half of 185 is like 92-93 watts. So, you're telling me that me holding 92-93 watts for two hours is supposed to be as hard on my body as going all out threshold for an hour?
John: Well, they would achieve the same TSS score. So, yeah, you would achieve a 100 TSS in each of those sessions. So, you're 60 minutes at threshold-- [crosstalk]
Andrew: One of those I'm walking away hurting and one of those I'm walking away fine.
John: Right. So, your TSS is the same in that case. So, there are obviously holes in TSS, and even in mentioning threshold, there are issues with that. So again, it's a descriptive analytic; it's available to look at, it's an estimate, kind of a rough quantification. But, yeah, we understand that 60 minutes at threshold is very different than two hours at 50% of threshold, or whatever the combination thereof. They're different energy systems, different recovery demands, different training adaptations. So, no, that is not what we're talking about when we're discussing advanced analytics.
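For reference, the commonly published TSS formula (the TrainingPeaks-style definition, not anything specific to TriDot) is duration in hours times intensity factor squared times 100, where the intensity factor (IF) is normalized power divided by threshold power. One quirk worth noting: because intensity is squared, two hours at exactly 50% of threshold scores 50 TSS under this formula, and it actually takes an IF of about 0.71 to reach 100 TSS in two hours; the linear hour-for-intensity trade-off in the conversation is a rough approximation. A minimal sketch:

```python
def tss(duration_min, intensity_factor):
    """Training Stress Score as commonly defined: hours x IF^2 x 100.

    intensity_factor (IF) is normalized power / threshold power,
    so one hour exactly at threshold (IF = 1.0) scores 100.
    """
    return (duration_min / 60.0) * intensity_factor ** 2 * 100

print(tss(60, 1.0))               # 100.0 -- one hour at threshold
print(tss(120, 0.5))              # 50.0  -- two hours at half threshold
print(round(tss(120, 0.707), 1))  # 100.0 -- what two hours really requires
```

Either way, John's larger point stands: identical TSS scores can come from physiologically very different sessions.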
Jeff: He mentioned the accumulation metrics earlier; the very first ones were just time, miles, meters, whatever. This is one step ahead of that in that there's some element of including intensity, but it's not very accurate.
Andrew: And so if you're a coached athlete out there and that is the leading metric that your coach is using to help kind of guide where your training should go next, then your whole training is based off of a metric that is somewhat flawed, right?
John: Well, it goes back to what we've been discussing; for the majority of these coaches and athletes, they're relying on their own theory or philosophy to prescribe training. So, the amount of TSS in a given session, or these other metrics that are derived from TSS, are derived from that theory, from that philosophy; it's the philosophy that says how much TSS is necessary for a given session. And that's going to be different, as we discussed earlier. Every coach is going to have a different answer because it's not the data that's determining it, it's the coach. Or there are guidelines that you can go in and say, a sprint-level athlete should have a TSS of this, or an Ironman-level athlete should have a TSS above that. But again, they're very vague, they're not individualized-- [crosstalk]
Andrew: And then who decided that?
John: Not the data. So, yeah, I mean, there were-- [crosstalk]
Jeff: Yeah, you can just look at groups of people and see, okay, here's about the averages and create some guardrails. So, here's the range, somewhere within this is a good guideline, so keep it within here. And the TSS is a measure of one session, you aggregate that into longer periods of time, a chronic training load, and an acute training load being smaller. And so it's kind of getting beyond just a session to multiple sessions, and how much is that training accumulated during a week or a month or six weeks?
John: So, I mean, obviously, there was a lot of time and thought that went into this. There were pioneers in our sport that paved the road to lead us to where we are today that allowed us to create data and analyze data and leverage software to do it. And yeah, it was great. It was a great tool coming up. But now just like in so many other aspects of our lives, technology and software has provided new insight for an even better way. So, just because it was good in the past or the best thing in the past, that doesn't mean that it's still particularly relevant. Technology has a way of disrupting what we do and how we do it, and that's exactly what's happening here in this case.
Andrew: Another popular one that I think a lot of athletes hold on to is the 80/20 rule. Is the 80/20 rule also kind of a-- similar to that?
John: Well, it's not a rule at all. It's another philosophy, just as we've discussed. And yeah, there's a certain amount of relevance to it, in that it is important to polarize training. And you could even look at a TriDot athlete’s training from a very high level and it may loosely resemble 80/20: 20% intensity and 80% easier-type stuff. But if it were exactly 80/20, that'd be complete coincidence. And even then, how do you break down the 20? So, it's 80% easy, 20% higher intensity, but what is the higher intensity? How do you determine it? Because it's not all one high intensity. High intensity isn't one thing; it's a lot of things. It's zone three, zone four, five, six; it's threshold, muscular, neural stress. The rule doesn't tell us exactly how much of those is appropriate. That's going to be different for every athlete based on all of their biometrics, their age, weight, gender, their experience, their race distance, their environment; all these things. It's not as simple as a nice round number. Unless it's complete coincidence for that one individual out there, exactly 80/20 is not going to be your best training plan. If you're coming at it from a very beginner, very low level, yes, 80/20 is-- [crosstalk]
Andrew: It's a good philosophy, but it's a philosophy. It's not-- [crosstalk]
Jeff: If you have someone that's training 20-25 hours a week, it may look 80/20. But if someone's training six hours a week, eight hours a week, it's certainly not 80/20, you know. So, that's very contextual to the athlete, to the age, to a whole bunch of different things.
John: We've discussed it on a previous podcast, in executing your training: when you have to miss or rearrange certain sessions, the most important sessions in any given week are those that include the intensity, what we refer to as the quality sessions. So, if you've got four hours to train a week, again, you don't want to apply the 80/20 rule. You need to be higher than that. I don't know exactly what it might be; maybe it's 50/50, maybe it's 60/40. But it's not going to be 80/20. The less time you have, the more important getting in that quality is. It's more important to get in the quote-unquote “20” than the 80. So, as the number of hours is reduced, that secondary number is going to be higher.
Jeff: We could spend a whole podcast on this one.
Andrew: And I guess we will, the more we talk and the more we figure out these are important things. But here's kind of what I'm gathering just listening to you talk about the holes in some of these philosophies. They're good training philosophies, but at their core, they're not a great way to prescribe training. It sounds like they're missing the component of data; they're not taking into account each athlete’s individual data files. So, Jeff, tell me a little bit more about what makes data so important to analyzing and prescribing training?
Jeff: Well, I think you just said it. Those, the TSS, the 80/20, provide some structure, some guardrails, for using your philosophy, your trial and error, to prescribe training. The reason the data is important is that it has to be, one, standardized, normalized, and contextualized for the athlete. Those are kind of big words, but if you don't do them, then you can't create cause-and-effect relationships. If you apply that 80/20 rule to an old person, a young person, a heavy person, someone who has been doing this sport 20 years, someone who hasn't, and you have these outcomes, you can't attribute any of those outcomes to particular decisions.
Andrew: All those people are just far too different.
Jeff: Correct. And so you don't know. So, you just keep applying it and keeping it in this so-called safe area. And you never really know; you don't know what the potential was. Kind of going back to that whole diabetic retinopathy example: here's what decisions look like made with technology, and here's what decisions look like made subjectively by experts trying to do their best; certainly better than untrained, but there's just this whole other level. The requirements of getting the data to a point where you can use technology are critical. And I think John did a great job right there with the TSS and 80/20, throwing in different elements of data to where it just becomes absurd. Clearly, 50% of your threshold for two hours is not the same as an hour at threshold. So, if you're trying to accumulate that metric, it's going to lead to bad decisions. So, that's what we've been working on for the past 10 years, even before we started applying advanced analytics: just standardizing the swim, bike, run. How do you get to a standard threshold score, an absolute and relative value, where you can compare one threshold to another, look at predicting how much increase is possible for a person? Where is the potential for improvement? How do you allocate the time? And the normalization of weather, of a whole bunch of other things?
You know, if you do your threshold effort, whatever your threshold ability is, at three in the afternoon outside versus early in the morning, it's very different. Your threshold changes; it's fluid, it's not static, but TSS treats it as static. So, in one case you're overtraining, or you're under-scoring, based strictly on the time of day. Contextualization means having data when you're making those decisions: your sport age, again, your body composition, genetics, workload shaping. How do you shape that during the week? What is the frequency? Is all of that TSS spread out, is it even? Is it really driven from high intensity with a lot of low intensity, or from high volume? All of those things matter and are going to lead to very, very different outcomes. But inside of an 80/20 or TSS, you know, acute and chronic training load, they're indistinguishable. You can have a whole bunch of different things going on, and they all fall within those parameters. But when you get data involved, you're going to see those very different outcomes, and that's where you can start determining the cause-and-effect relationships between them. So, it's about recognizing the decision-making models and where each metric actually comes from. How do you define those things? What metrics are you looking at? And then how are they used in the decision making; kind of going back to the very first points we were making.
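To illustrate what "normalizing" for conditions means here (with completely made-up coefficients; this is not TriDot's actual adjustment model): normalization maps a performance recorded in one set of conditions onto a standard baseline so that sessions become comparable. A sketch for heat:

```python
def normalize_power(observed_watts, temp_f, baseline_temp_f=60.0,
                    pct_per_degree=0.15):
    """Estimate what a ride's power 'would have been' at a baseline
    temperature. pct_per_degree is a made-up illustrative coefficient:
    assume output drops ~0.15% per degree F above the baseline."""
    degrees_over = max(0.0, temp_f - baseline_temp_f)
    correction = 1.0 + degrees_over * pct_per_degree / 100.0
    return observed_watts * correction

# Same athlete, same effort, different conditions (hypothetical numbers):
cool_morning = normalize_power(185, 58)   # no heat penalty to undo
hot_afternoon = normalize_power(176, 95)  # 35 degrees over baseline
print(round(cool_morning))   # 185
print(round(hot_afternoon))  # 185 -- comparable once normalized
```

Without a step like this, the afternoon ride looks like a fitness loss when it may be nothing but weather; with it, the two sessions can feed the same cause-and-effect analysis.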
Andrew: When you hear all those different metrics and all the analyzing that's being done with them, it's just such a reminder, going back to the doctor example. I mean, it would be near impossible for a triathlon coach to consider all of those data points for every individual athlete and make great decisions, the right decisions, forecasting an athlete's future training. But it's not very hard for the computer algorithms.
Jeff: That’s what it was made for.
Andrew: And the technology to do that. So, when an athlete is looking for a program or a coach that is claiming to use these terms, and claiming to use data and analytics; how can we sort through the noise and kind of know, John, who's doing it right, who's doing well, and who's just throwing those words around?
John: So, like a lot of things, it comes back to what's behind the claim. You know, for years, we've seen things like custom training plans, personalized, dynamic, adaptive; all these things. But what is it that is making it individualized or customized or adaptive? Is it a person that's adapting it? Is it a set of basic rules that are adapting that training? Or is it, as we've discussed with TriDot's analytics, leveraging this big data technology to make those changes and refinements to the training? So, it really takes a little bit more investigation to find out exactly what it is that's powering it. So, if it is artificial intelligence-based, what is the artificial intelligence? What is making those decisions?
Andrew: What data points are they analyzing?
John: Right. Or is it even data that is driving the artificial intelligence because that's not necessarily the case. I mean, artificial intelligence can be as simple as a set of rules that are applied by the technology. It's not necessarily leveraging data or training files to make those decisions.
Andrew: It reminds me a little bit of when you call customer service at a company, and you get the AI artificial intelligence voice, right, that talks to you. And we all know, like, good golly, those things usually, unless you have a very basic problem; those things usually can't solve your problem. You have to press enough buttons to finally get to the human operator who can help solve your problem. But for those companies, they can say, “Oh, yeah, we use artificial intelligence.”
John: Right. And it does a job and that's kind of what I mentioned is when you have those very basic questions, it's applying a simple set of rules. It receives a question and it gives an answer. So, again, it's not that software that’s creating that answer, it's simply applying a rule that was already set forth. It was told what the answer was to the question prior to the question being asked. So, that is an example of artificial intelligence, but that's not the type of advanced analytics that we've been discussing here.
Andrew: So, Jeff, when we're looking at these terms, just beyond anything computer-generated in our training, getting into analytics; what should we be looking for when it comes to artificial intelligence?
Jeff: There are a lot of different technologies, and they're all applied and employed differently. One, as a first requirement, there must be a ton of data to do all this; a ton of data, like that retinopathy case, where they used 35,000 fundus images to train on. So, one of the ways you can tell is by learning about the company's technology and their data set. We've been doing this for 15 years, accumulating data for 15 years. And it can't be noisy data. It has to be clean data that's been normalized and contextualized and standardized. So, this stuff that we did for the first 10 years, before we even employed these big data, advanced analytics, has to be done first. So, there's no one that's going to come on the market soon and just start doing this. There's a whole lot of work that has to be done ahead of time in our space to do that. So, some of the pieces include big data, a bunch of data; data mining, different techniques to get to correlations and causation, to determine whether something is merely correlated or whether there's a cause here. Machine learning is using machines to look at an image, or a set of unknowns in structured data, and say, here's the known outcome or decision that should come from this; training it over time, letting it learn, and then putting in new things it hasn't seen before and letting it apply that learning to those new things, those new circumstances, the new structured data. Deep learning, you might have heard that term, works more on unstructured data. So, it's a lot more the neural networks and associated matrices that are looking and actually learning. It's not even structured data, so they don't know what they're looking for. An example would be predicting gender from the fundus image; no one programmed it to learn that, it learned it on its own.
Andrew: So, that's an example of artificial intelligence that is deep learning.
Jeff: Correct. That's deep learning and that's all under the umbrella of AI. So, all of these are different components, but they're used to solve different problems.
Andrew: Because to an average Joe, we see artificial intelligence and just assume. There might be different degrees, but I didn't know there was a distinction between machine learning and deep learning and different types of AI.
Jeff: Yeah. There's a ton, and it's ever-changing with technology. The one company that you mentioned that uses the term AI; how they apply the AI, their use, is replacing the coach. They want to replace-- it's "the future of coaching." And so their claim to fame, their results, is not "here are the results we put forth" or "here is our technology and our data set." It's "we have expert coaches."
Andrew: They're leaning on these big-name people.
Jeff: Right, and that's the source of their decision making: the person. They're teaching AI to replicate a human's flawed decisions. As expert and as world-champion or elite caliber as you want to be, you're still just like the-- What is the-- [crosstalk]
Andrew: The doctor that gets 65% right. [crosstalk]
Jeff: Ophthalmologists. Those are trained, PhD-level people making 65% accurate decisions. So, I don't care how many PhD-level coaches, athletes, elites, world champions are making these subjective decisions; that AI can only be as good as that coach's decisions are.
John: And it effectively becomes like your automated answering service in that it’s-- [crosstalk]
Jeff: It's automating a human is what that application is.
John: Right. It's not data that's determining the answer to your question; it's a pre-programmed answer that is answering the question. So, again, it's what was told, what was taught, what was input already, not based on analytics of data. It's based on, again, going back to-- [crosstalk]
Jeff: So, that's where-- We looked, a long time ago, at getting the word out about TriDot. What do we do? Do we sponsor someone, do we find someone that's going to endorse us? And we chose ethically not to; we didn't want to go that route. Because how that person became an expert or a pro or whatever was not by using TriDot. We have pros that are pros because of TriDot, for sure. But it's like, I think back in the '80s, the Bowflex: here's this new technology, this exercise equipment, and this really buff man or woman is on there. Well, if the equipment just came out, they didn't get that way using that technology. [crosstalk]
Andrew: I always look at the ab cruncher commercials, the belt that just sits on your abs, and you're like, okay, the person in that commercial has incredible abs, but they didn't get those incredible abs just by sitting there using that belt.
Jeff: Exactly. So, that's one telltale sign. If their credibility is from an expert, a pro, a highly accomplished athlete, that's what their outcomes are based on; that's their credibility, their source of outcomes. We don't do that. It's the technology we tout; you know, the 15 years of data. We tout the technologies that we've employed. We tout the standardization, the TriDot, the training stress profile, environment normalization, the Train X scores, the Race X scores, the PhysiogenomiX.
John: We give away free training to thousands of athletes every year so that we can make these claims. And our claims are based on actual athlete performance. We give it away for free so that we can have [crosstalk] several thousand athletes, not one or two anomalies. This is the average for thousands of athletes just like you.
Jeff: So, all of those things, I think, are telltale signs of what's behind the decision making. So, when athletes are looking to see, you know, who decides, they can see the data, how long the company has been building the data set, and the actual results from athletes — not just a few highlights or anecdotal testimonies, but thousands of athletes. That 2.4 number, the first time we started measuring it, was based on more than 3,300 athletes. So, that's the average result across 3,300 athletes, not the elite accomplishments of one and their know-how, but something repeatable-- [crosstalk]
Andrew: This one athlete performs 2.4 times better.
Jeff: Yep. So, all of those are measurable, comparable. Those are all the things that make it really obvious what technology is behind the analytics. Because again, back to the basics: data is produced and generated for the sake of analytics. The purpose of analytics is to lead to insight that's used in decision making. So, anywhere you fall short of that, any breakdown in that process, and the decision making is gonna be faulty. It doesn't matter what the data is if it's not being used in that proper way, yeah.
Andrew: So, we've covered a ton of ground on data analytics and their use in our sport, and how we're leveraging them toward better training. So, guys, to close this out, I'd like each of you to respond briefly with a word of encouragement, a piece of advice for our athletes out there as we continue to see technology and analytics impact more and more of our triathlon experience. What would you say to the athletes listening?
Jeff: All right. I hope that through this podcast, other podcasts, and getting to know TriDot, we can encourage athletes to, one, be discerning, with a healthy degree of skepticism whenever they're looking at anything, and try to understand it. We always say trust the process, but not blind trust. So, do it and the understanding will come, but seek that understanding if you care to know. And don't be intimidated by technology; realize that new smartphones, PCs-- [crosstalk]
Andrew: Don’t be intimidated by this one-hour conversation we've had about data analytics and deep learning.
Jeff: But the goal, again, as you put it at the very first, is to have a functional working knowledge, a familiarization, enough that you go, okay, I get it. I recognize the differences here. I recognize quality, not quantity. I recognize hype versus quality, and you can make that decision. So, I'm just excited about the future. Looking back at how much we've accomplished, and knowing what we're working on now, I think we have some exciting times ahead. No pun intended, race times. Anyway, I just look forward to it, and to the contributions of our athletes. We absolutely wouldn't be where we are today were it not for their input, their data, their encouragement, participation, and feedback.
Andrew: Absolutely. John, what would you say to athletes as a word of encouragement to close us out today?
John: Well, the great thing about this, and a lot of the technology that we use today, is that it's incredibly advanced. There's a whole lot that goes into it. And the good news for us consumers is that it's actually very simple for us. We often use GPS mapping as a parallel to TriDot. There's an incredible amount of data and predictive and prescriptive analytics that goes into creating routes that get you from point A to point B in less time. The great thing is, no one needs to know how that route is determined. But we all use GPS mapping, oftentimes even just to get someplace we already know how to get to. It tells us where the traffic is, it tells us if there are road closures and all that; it tells us where the police are hiding out.
Andrew: Thanks, Waze.
John: Yeah. So, as advanced as what's going on in the background is, the execution is actually very simple. Executing a TriDot training plan is as simple as looking at the day's session, knowing exactly what you need to do, and going out and doing it. So, just know that there's a lot going into it: a lot of technology, a lot of data, a lot of software, computer-type stuff. But at the end of the day, all you need to do is pull up your daily session, do the work, and you'll reap the benefits.
Andrew: Great set, everyone. Let's cool down.
Joanna: Hi guys, Coach Jojo here. My name is Joanna Nami and I am a TriDot coach based in the Houston area. I've been asked to talk a little bit about the team I created a couple of years ago, HissyFit Racing. We are a Houston-based women's TriClub.
First, a little bit about myself. I am a 14-time Ironman finisher, soon to be 15 at Ironman Texas, so hopefully I will see a lot of you on the course. I will be racing in Kona at the Ironman World Championship in October, so that is super exciting. Feel free to follow my journey on Instagram @coachjotridot. I also post a lot of training tips and advice there, so feel free to follow me.
I have been using TriDot for about five to six years for training, and I love it, obviously. I've been coaching for Tri4Him and TriDot for about six years. I first came up with the idea of an all-female TriClub about three years ago after Ironman Texas. I was chatting with three of my friends, Julie Schultz, Marie Michelson, and Susan Oiler, who are also co-founders of our team and have been integral in making it a success. We talked about how we could come up with an all-female team, and we decided it would be based in Pearland, which, again, is just south of Houston.
At the time, I was meeting with tons of women who were new to triathlon and looking for advice on how to get started in the sport. I saw a need for a TriClub that would provide women at any level of the sport with opportunities for group training, educational resources, a social outlet in the sport, and overall support when it came to training.
At the time, I had no idea how fast HissyFit would grow. Currently, we are at over 270 members internationally. Most of our members are in Texas, but we have active members all over the country. We actually have some coming in for Ironman, Texas, so that will be exciting.
We open HissyFit to all women, whether they are active triathletes, runners, swimmers, cyclists, supporters, or just interested in learning more about the sport. Some of the events that we hold each year are social events around the holidays and during the summer. We have training events, which include weekly group rides and monthly open water swims, and we do swim clinics and group runs almost every weekend.
So, most of these events are Houston- and Pearland-based, but we do pick team races, which include local sprints and half and full Ironman races throughout Texas. We have a group going to Ironman Florida this year, which is different, so we are starting to branch out into other states for our races.
HissyFit gets noticed a lot on the race course for multiple reasons. Of course, we have our awesome TriKits that we wear. You'll notice us in red, black, and white leopard print, and we say 'Me-wow' on the back of our kits, so we get called out quite a bit.
But what makes me even more proud, I think, is the way that we race. You will often see HissyFit teammates running together and cycling together, and they're always loud, spirited, and enthusiastic on the course. We pride ourselves on that spirit. I was a college cheerleader, and I think that comes out in everything that we do.
What we always hope to show, mainly, is camaraderie and encouragement to all the athletes in the sport. That was kind of our mission statement: to encourage women everywhere to take that leap of faith and start running, or start swimming, or start an active lifestyle. Sometimes people just need a little bit of encouragement. And I know that, overall, triathlon can be very intimidating; it's a very individual, very competitive sport. I wanted HissyFit to be a safe place where women felt they could ask questions and participate, where it wasn't a competition among women, but just a place where you could find support, encouragement, and a way to better yourself.
So, anyone interested can look us up on Facebook @HissyFit Racing. You will find information on the TriClub's Facebook page, and you can request to join under HissyFit, The Cat Pack. Or look us up on Instagram @hissyfitracing.
I've heard so many amazing, positive testimonies from women who just needed the encouragement and push to get started. Many of them have told me that our group has changed their lives, which is just a dream of mine; I never would have imagined that it would have that kind of impact. So, feel free to join our Facebook page. Please come race with us, and I look forward to seeing you all on the course.
Andrew: Well, that's it for today, folks. I want to personally thank TriDot CEO Jeff Booher and Coach John Mayfield for taking us on a journey deep into the core of TriDot. Enjoying the podcast? Have any triathlon questions or topics you want to hear us talk about? Email us at Podcast@TriDot.com and let us know what you're thinking. As your host, I always want to be the voice of the people, and the more I hear from you, the better I can talk about the things you care about. So again, drop me a line at Podcast@TriDot.com. We'll have a new show coming your way very soon. Until then, happy training.
Outro: Thanks for joining us. Make sure to subscribe and share the TriDot podcast with your triathlon crew. For more great tri-content and community, connect with us on Facebook, YouTube, and Instagram. Ready to optimize your training? Head to TriDot.com and start your free trial today. TriDot, the obvious and automatic choice for triathlon training.