AOCOPM 2024 Midyear Educational Conference
346719 - Video 10
Video Transcription
Our next lecturer is Colonel Ribeiro. He is a Command Surgeon, Aerospace Medicine Specialist, and FAA AME with over 15 years of experience in the U.S. Army. He currently serves as the Command Surgeon for the U.S. Army Combat Readiness Center, or should we call it the Safety Center, at Fort Novosel, Alabama. He oversees the analysis, training, and development of programs that prevent accidental loss of soldiers, civilians, family members, and vital resources. He has previously held leadership positions at Weed Army Community Hospital, U.S. Army Cadet Command, U.S. Special Operations Command South, the Medical Simulations Training Center, and the 25th Combat Aviation Brigade. He is board certified in aerospace medicine and has been involved in the COVID-19 response. I'm so sorry. Me too. So without further ado, Colonel Ribeiro is gonna talk about the Human Factors Analysis and Classification System. Welcome, Colonel Ribeiro. Hi, thank you so much. I'm glad that you clapped at the beginning of this. Maybe you're not gonna clap at the end, I don't know. Again, like Jen said, I'm the Combat Readiness Center Command Surgeon, formerly known as the Safety Center, and a big part of the mission that the Safety Center, or the CRC, conducts is mishap investigation. We gather the information, we analyze it, and then turn it into something that the service, the Army, can actually use to prevent the next mishap, or to prevent further injuries, particularly to our personnel, and to prevent loss of equipment as well. Okay, and I'm here to discuss human factors. As you can read there on the slide, approximately 85% of fatalities in the Army are caused by human error. That leaves 15% to the other two factors, which are materiel and environmental factors. So out of the three, the one that pertains to us at this point is human factors, because again, that's obviously the most significant one for the Army.
And I dare say the same would be true for the Air Force, Navy, Marines, and, in the news lately, Space Command once it starts having mishaps; the Coast Guard is also included in there. And this pertains to civilian aviation as well. It tends to be the recurring theme for all aviation systems. So this is what we're gonna do. We're gonna apply HFACS to classify unsafe acts. We have this handy-dandy little book that all investigators and investigation boards have. And this is how we frame findings and recommendations so that, again, the folks who have the means to change systems can do so, okay? Talking about leaders, commanders in our case, so that once again, we're aiming for prevention of injuries, prevention of the next set of mishaps. And this is how this all began, right? At least as a unified DOD effort: in 2003, the Secretary of Defense challenged the services to reduce mishaps, and that turned into the Joint Service Safety Council. And then finally, a few years later, you have a Department of Defense instruction that mandates reporting of data, to include human error data, using a common human error categorization system. The issue before that time was that the services were using similar systems, or variations thereof, but at the level of the Secretary of Defense you could not compare, say, Army mishaps to Navy mishaps to Air Force mishaps. So this puts everything under the umbrella of the Secretary of Defense, or the Department of Defense, so that at that level they can compare mishaps and trends a little bit more apples to apples as opposed to apples to oranges. So the Human Factors Analysis and Classification System works at a few different levels.
For the safety officers and the investigation team, it facilitates the analysis of human factors; it provides a framework, provides structure to the actual investigation; it assists in the development of interview questions; and then, at the end, which is what we all look for, it helps in framing recommendations that will have impact, okay? Not just the same old recommendation over and over again. For leaders at the unit level, again, the intent is to prevent the next mishap. So for leaders, the lessons learned during a mishap investigation can then drive risk management processes for mission planning in the future. And like I said before, at the DoD level, it provides a standard means of looking at trends and doing research across the DoD. So here it is. This is the one slide that you all need, or that should be, hopefully, familiar to at least the flight surgeons in the group. Again, this is adapted from James Reason, and also from Wiegmann and Shappell. And this is the model that we follow, and it's been in place for a significant period of time. The famous Swiss cheese model: when the holes in all the Swiss cheese slices line up, you effectively have a hole through the whole block, and that's when unsafe acts can turn into mishaps. So in a nutshell, we're gonna look into all these slices one by one, but again, in a nutshell, this is what we're looking at. We call things failures. The unsafe act is considered an active failure, and everything else is considered a latent failure. And like I said, let's go through these layers, okay? So the active failure, if you look from right to left, describes what happened. That is what caused the mishap to occur. It's the last hole in the Swiss cheese block; if that hole is blocked, then you don't have the mishap, right? The active failure is what caused the mishap to occur. That's the what.
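As a rough illustration, and not any official tool, the Swiss cheese idea can be sketched in a few lines of Python: a hazard only becomes a mishap when every layer of defense has a hole lined up with the others.

```python
# Illustrative sketch of Reason's Swiss cheese model (hypothetical code,
# not an Army or DoD tool). Each "slice" is a layer of defense; True means
# that layer has a hole, i.e. a failure.

def mishap_occurs(slices):
    """A mishap gets through only if every layer has failed."""
    return all(slices)

# Latent failures line up, but the last barrier (the unsafe act) holds:
print(mishap_occurs([True, True, True, False]))  # False, no mishap
# Every layer fails, including the active failure at the sharp end:
print(mishap_occurs([True, True, True, True]))   # True, mishap
```

Blocking any single slice, the point of the model, is enough to stop the chain.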
The latent failures are the factors that attempt, at least attempt, to describe why things happened. So we'll look at the level of the individual, training issues, support issues, standards (read: policy), and leader factors that, again, set the conditions for the unsafe act to occur. So looking at the active failure, looking at the what: the unsafe act can be categorized into one of two, or really three, types. You've got the error: a person commits an error, it leads to an unsafe act, then a mishap. Errors can be further categorized as performance or skill-based errors, or judgment and decision errors. The other type is the violation. And even though, at least in the Army, that is kind of rare, it's not as rare as you would think. It's not as rare as we actually hope. We've seen that in a few of the more recent aviation mishaps, and I'll have to stop there because those investigations are still ongoing. So again: performance or skill-based errors, judgment errors, and then overt violations or indiscipline. So again, active failures look at what happened. If you have a finding, you can only assign one unsafe act to that finding. And there are 13 of those, and we'll look a little bit into each and every one. Like I said before: did the mishap person make a performance-based error? And you have in the book the concerns that pertain to performance-based errors. Let's look at that. These are the 13 that I just talked about: performance-based errors, judgment and decision-making errors, and violations. Now, in terms of performance-based errors, at least in the last couple of years, or since I started my job at the Combat Readiness Center, the ones that I've seen most are the last one, rushed or delayed a necessary action, i.e., commission or omission of something that needed to happen that would have prevented the mishap, and over-controlled or under-controlled the aircraft or vehicle.
Again, the majority of mishaps that we see involve vehicles, whether aviation or ground. There are other types of mishaps, and we'll discuss one as an example. But those are the ones that have shown up most often in findings on safety board reports: over-control or under-control of a vehicle, or rushing or delaying a necessary action. In terms of judgment and decision-making errors, again, you see four of them there. Failure to prioritize tasks adequately would be the one that jumps to mind right now. The other ones do show up, or have shown up, but again, it's the what do I do first? And the wrong choice of action. I want to say this is an accident, or mishap, from a few years back. Again, wrong choice of action: the pilot basically turned off the wrong engine, the one that was working. And if you turn off the one that was working, and the other one wasn't working, then you have no engines, right? And we lost one pilot to that mishap. So those are more or less the more common ones that I've seen in the last few years. In terms of violations, like I said before, there's not a whole lot that can be said about those. Those tend to be a little bit more clear-cut. But they're not zero. Like I said just now, there's been at least one mishap in the recent past that has been attributed to a lack of discipline on the part of the aviators, and both of them perished because of it. So those are the what happened. Let's look at the latent failures, which again attempt to describe why things happen. And I say attempt; I don't say describe. Sometimes there really isn't sufficient information, even when the investigation board digs as deep as possible, and we can't find the reason why, okay? Sometimes that just happens. The parachuting accident in the last couple of years, again, all we had to go on was: why was the person parachuting? What was he thinking? Why did this happen? And why did he hit the ground so hard?
And there wasn't really anything other than speculation at that time as to why things happened the way they happened. Ground vehicles tend to go into that category as well. There's a whole lot more information in aviation because of voice recorders and data recorders and all those things. So it's a little bit clearer with aviation, but again, with ground and other types of investigations, sometimes we can't really get to the why. And here are your latent failure codes. There are 96 of them. There's just a bunch of them. That's why we have the readily available book, and you can peruse it and try to figure out what is what. They attempt to describe why things happened, okay? You can have only one active failure per finding, but you can have as many latent failures as the mishap requires, basically, for you to understand the process of the event itself. Okay, so the process, like I said, for the investigation board is fairly straightforward. You start from the bottom and you follow the algorithm. It's as simple as that, a little bit of which we will do here in a second, okay? So again, trying to understand the failures at the individual level: now we've identified that there was something that somebody did that wasn't safe, okay? Why was that individual involved in that, and what were the causes that led him or her to perform an unsafe act? And this is where the medical officers dig deep into these things, okay? Obviously medical or physical conditions; flight surgeons in the group, you're all there. That's the reason why we have flight physicals, right? Trying to identify those things early and make sure that individuals are qualified to perform aviation duties. State-of-mind factors: psychological, psychosocial issues. Sensory perception goes to things such as spatial disorientation, which is another one of those things that recurs, that I hear very often in my office.
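The coding rule just described, exactly one unsafe-act (active failure) code per finding but as many latent-failure codes as the mishap requires, can be sketched as a small data structure. This is purely illustrative; the category names below are stand-ins, not the actual HFACS 8.0 code list.

```python
# Hypothetical sketch of the "one active failure per finding, many latent
# failures" rule. Category strings are illustrative, not real HFACS codes.
from dataclasses import dataclass, field

ACTIVE_CODES = {"performance_error", "judgment_error", "violation"}

@dataclass
class Finding:
    active_failure: str                              # exactly one unsafe act
    latent_failures: list = field(default_factory=list)  # zero or more "whys"

    def __post_init__(self):
        if self.active_failure not in ACTIVE_CODES:
            raise ValueError("each finding needs exactly one recognized unsafe act")

# Modeled loosely on the range example discussed later in the talk:
f = Finding("judgment_error",
            ["individual: complacency",
             "individual: untreated psychiatric disorder",
             "leader: failure to enforce standards",
             "leader: unsafe culture on the range"])
print(len(f.latent_failures))  # 4
```

The asymmetry is the point: the what is singular, the whys accumulate.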
Vestibular, visual, auditory: things that create a misperception in an individual that leads him or her to take an action that causes a mishap, okay? Mental awareness factors, looking at fatigue and other things in there, and then the physical environment, or physical factors, that affect the individual's actions at the individual level. So let's take that into context. And again, I have an example here. Like I said, I tried to make this not aviation specific, because we can adapt this to a number of mishaps or accidents. And in this case, it was a weapons qualification range. Those of you in the Army know what I'm talking about; at least, those in the service should know what I'm talking about. And in this particular instance, the event in question happened after the qualification, after the training was completed. So in this case, there's a unit doing weapons training. Everyone in the unit went through their training. They qualified or didn't qualify or whatever, and it was end of exercise, we're done. They look back and they see that they have a bunch of ammunition left. So when you have a bunch of ammunition left, you can either turn it in or you can use it. Well, they decided to use it, and that's fine. That's not in and of itself an unsafe act. It looks like they had a ton of ammunition, because basically the officer in charge let the soldiers know: hey, just go through the ammo, shoot it downrange, and we'll be done. This one soldier in particular, you see there in the second box, fired between 526 and 528 rounds in seven to eight minutes. They had a bunch of ammo, and that's just one guy, right? And we'll talk about that at the end. The weapon is designed not to exceed 180 rounds, and he fired 500 of them. But wait, there's more. They call a cease fire. The soldier still has ammo in the rifle, in the weapon. They call a cease fire. He gets up from his position.
He puts the weapon on the ground facing sideways, facing other people. And we're talking about an overheated weapon. He didn't clear it, so it's loaded, and the weapon is actually pointing at other soldiers. So the ammo cooked off inside the weapon, and the weapon by itself fired four rounds toward the soldiers who were just standing around, hitting one of them in the abdomen and causing him to die. That's what happened. And we'll look at the finding here in a second. But look at all the things that the investigation found, which I've tried to describe: again, we're looking at no direct supervision, company leadership, an unsafe culture, all these things. So how do we put that into something that, again, a senior military leader can use to make recommendations to change things, okay? All right, before we look at the finding, this is step three. We looked at individual failures, right? That's step two. Leader failures is step three. And those go under unsafe supervision, or leader failure. Unsafe supervision: policies and practices of the leadership that directly affect the conditions or actions of an individual and result in human error; and the failure of leaders to monitor the mission, monitor the planning, correct inappropriate behavior, and emphasize correct procedures, resulting in an unsafe act. So what we found, and this is directly from the report in this particular instance: finding one, contributing to the mishap. Upon completion of firing 526 rounds through a weapon in seven to eight minutes, a non-commissioned officer failed to prioritize the task of making the weapon safe upon receiving the command of cease fire. The NCO failed to clear the weapon, extended the bipod, and placed the weapon on the ground oriented toward other soldiers on the firing line.
As a result of this, the heat from the overheated weapon cooked off rounds, and that caused the weapon to fire four rounds on its own, one of which struck a soldier in the abdomen and basically killed him. Again, in terms of individual failures, take a look at what the board attributed that to. The first paragraph is what happened; this is what I just described, right? The second paragraph then goes into the reasons why things happened. The board attributed the non-commissioned officer's actions to complacency and an untreated psychiatric disorder, which are individual-level failures. Wait, there's more. At the leader level, it says failure to enforce existing standards and an unsafe culture on the range. All right, again, the focus changed from training, because they had already done the training, to just spending the ammo. Yeah, let's spend the ammo. That's basically it at the end of the thing. You might say that there's a little bit of get-home-itis in there as well. It's like, hey, just expend everything, and as soon as you're done with that, we can go home. Right? So the individual in question, again, was told to fire for 10 minutes, let the weapon cool off for 10 minutes, and resume firing. Again, it only took him eight minutes to fire 500 rounds. It's insane. The board concluded that the non-commissioned officer ignored the hazards, placed the weapon on the ground oriented toward the other soldiers, and was not attentive to the risk associated with this course of action. So that's the complacency part. Okay, the untreated psychiatric disorder comes as a result of ADHD, which the soldier had been diagnosed with a few years earlier and told to follow up with primary care for treatment and further diagnosis, I would presume. But that's something that the non-commissioned officer did not do. So we're talking about untreated ADHD. The board at that time then concluded that his actions were consistent with him having that diagnosis. Okay, leader failures.
During the supervision of a live fire exercise, the officer in charge and the range safety officer failed to evaluate the risk associated with their decisions in the conduct of training. So they failed to assign line safeties, of which you should have one per four firers, didn't monitor rates of fire, and deviated from the approved course of fire. As a result of that, again, an unsupervised soldier fired 520-some rounds in less than 10 minutes, which exceeded what the weapon was designed to handle. And upon the command of cease fire, that soldier oriented the weapon in the direction of others, and that caused the mishap to occur. The board concluded the officer in charge and the range safety officer failed to enforce range safety standards, like I said before, requiring one line safety per four individuals. They didn't use line safeties; basically they declared that everyone was a safety. They overvalued the individuals' capacity to maintain safety and used, quote unquote, big boy rules. Talk about being the first female in the Air Force. Yeah, we're talking about big boy rules. Well, big boy, big girl, I guess. Hey, that created a situation where no one realized that the weapon was not cleared and that it was not pointing downrange. It wasn't pointing in the right direction; it was pointing toward a bunch of soldiers. The officer in charge authorized an unnecessary hazard when he directed the soldiers to fire for 10 minutes and wait for 10 minutes. Again, this individual right here fired the weapon and overheated it in less than 10 minutes, so that policy, that directive, didn't really make a lot of sense. And they failed to provide direct supervision of the firing line. The board concluded that both individuals overvalued, or overestimated, the personal capacity of the soldiers on the range to execute the mission of expending the ammunition safely. So again, why did they miss that? Yes. Can I just ask real quick, it said full auto.
Was that a modified weapon, or do they mean three-round burst? No, this was full auto. Okay. So what year is this? Again, the slide probably isn't exactly when this particular event happened; it was May of '16, and they stopped when the full auto went to a burst. Yeah. Okay. Whatever happened to the range officer? I'm assuming he got, at a minimum, a letter of reprimand. Do you know the ultimate punitive consequences to the individuals? Okay. So again, in terms of what we do, we have a clear line of demarcation in terms of safety. The safety investigation focuses on changing systems, procedures, and policies, not specifically punishing anybody in particular. Now, having said that, there's a secondary investigation running concurrently with the safety one, and that is a punitive investigation. So we try not to look into what they do, but yeah, definitely if there are folks that need to be counseled, punished, or otherwise, that would be the investigation to do so. In this case, again, I'm not really sure what happened in terms of punishing any of these actors, but at least you would say the individual soldier would have been found at fault because of the things that he did, and so would those two particular individuals: the officer in charge, that's why they're in charge, and the range safety officer, who again definitely did not perform his duty. In 1978, when I was company commander of an infantry company, this would have been called a failure of command. Yeah. Absolutely. And this is, you know, instruction, et cetera, you know, I mean, all the other things. You're part of this, but this is appalling; there's no other way to phrase this. You know, there was a safety incident that occurred and resulted in a loss of life because of a failure of command, a failure of leadership to do the things that were necessary. You never should have put a range safety officer without appropriate training up there.
The OIC should never have been allowed that command. That soldier should have been caught, should have been relieved. All of these things were wrong. You talk about the Swiss cheese model: there are so many things on here that it is obvious even to the uninitiated. You know, there's the Swiss cheese model; I'm thinking about how many layers failed. So first of all, when you're on a range, everybody should have a safety buddy. Right. So one person fires and the other observes. So the second he laid that rifle sideways, that buddy had some culpable negligence too. The NCO was lax, the OIC was really lax. You know, and I almost feel like I need to apologize to my fellow infantry officers. Is that, you know, Dr. Berker would say, that's just poor leadership at every level there. That would be a 15-6. Yes, absolutely. What you're talking about would be handled in the 15-6 investigation. That would be UCMJ. No, I'm still thinking that there's a difference between this, but, you know, there's a difference between the safety issues, I understand, and the leadership command issues. And the leadership command issue resulted in the safety issue, which resulted in the death of a soldier, for which there is no excuse here, period. This is a court-martial offense. And to listen to the way you're phrasing this, I understand in part, but, I don't know. So you're saying people are going to die in training so you don't have people dying in combat; that's not going to get you there. Well, again, the focus here is on safety, not on punishing. So I'm with you. And again, in the parallel investigation, that's what should have happened. I don't have those details to tell you about. But in the safety realm, we ask: how can we avoid this from happening again? All right, so the condition would be what? Train your OIC, or train your range safety officer. Okay, great. But they should have been trained before.
Why did that not happen? And what can we do? What other things can we actually put into the system to ensure that, again, the OIC is trained, the RSO is trained, NCOs are trained, and so on and so forth? So the intent is to figure out, not culpability, but how we can keep this from happening again. Patrick? Can you mention safety privilege and how that's intertwined with the difference between the two investigations? Safety privilege? Okay, so the term, or the item, of safety privilege on the safety investigation: well, I don't even know that the NTSB would take safety privilege as far, but anyway, let me go back to the safety privilege thing. Again, the intent is safety. The intent is to prevent the next mishap, the next injury, the next accident. So the investigation provides witnesses and folks with assurances of privacy, of confidentiality, so they can express whatever they feel the safety board needs to look at in order to make recommendations and findings. That confidentiality exists because it's a non-punitive investigation. We just want to figure out what happened and why it happened so we can avoid it in the future. The UCMJ part, the punitive investigation, is, again, a legal process. Witnesses have to be briefed before you question them. It's the anything you say can and will be used against you in a court of law type of situation, and if you want to stop here and get a lawyer, do that before we begin, and all that stuff. And the two investigations can't simply talk to one another. The safety investigation can only share things that are factual with the UCMJ investigation. I.e., if there's a picture of a person or something that the investigation on the UCMJ side needs or wants to have, they can produce that.
If the photo or the information has analysis on it, then that is safety-confidential information, and that is only for the safety side. So again, those two investigations run concurrently, but they typically don't talk to one another, because the focus is different for each one. Makes sense? But you're on point, okay? That's your role for this. I'm not saying no. I'm just trying to look at the safety portion of this. All right, great. Any other questions? And this is great; bring it up if you have any questions. All right, so why did the soldier, the NCO, why did that person commit an unsafe act? Like I said before, the board looked at these factors, and you could probably fit the ADHD into one or two of these categories: either state of mind, which deals with psychological disorders, in this case untreated ADHD, or a medical or physical condition. Again, untreated ADHD tends to be categorized under state of mind. Step four. Yeah, step one was defining the unsafe act. Number two was looking at the failures of the individual. Number three was looking at the leaders. Number four is then looking at support failures. Okay, and actually you already mentioned that. Yeah, you didn't have enough gas masks to go around, right? That's a support failure, a resource problem. That is, again, if you were in the Army at some point in your life, you've heard of doing more with less, right? That is a support failure. All right, support failures: resource management processes and policies that directly or indirectly influence system safety and result in inadequate management, allocation, or maintenance of organizational resources, monetary, equipment, and facilities. So if you don't have enough of what you need, your risk goes up, right? You're looking for something to go wrong.
If you have the wrong stuff, you may be looking at your risk going up as well. Just because you have a lot of one thing doesn't mean that it's the thing you actually need, right? So you still have an issue with the lack of support and the wrong equipment. Okay, cost cutting, lack of funding. That's kind of like the environment I've lived in since I joined the Army. All right, and that includes personnel, okay? Units manned at 80%: you may have the right 80% combination, or you may not have the right 80% combination. If you need a surgeon in your group, but they didn't give you a surgeon, but they gave you an extra family medicine guy, does that mean you're good to go? Not likely, correct? So personnel issues, funding, like I said, equipment, facilities, so on and so forth. The technological environment factor comes into play with design factors or automation that result in human error. And this covers a range of things. In aviation, if the pilot doesn't know how to operate the autopilot, or if there's a new system that hasn't had all the bugs worked out yet, then that's a technological environment issue, okay? Or if, well, I'll stop with the Boeing thing. So next, training failures. We talked about that with the OIC and the range safety officer, right? There was a training failure there. Training that is not correct, complete, or sufficient for the individual to perform the task to standard: that is a training failure. And finally, a standards failure. Is the policy incorrect? Is the policy unclear, impractical, or inadequate? The SOPs: are there loopholes, are there holes in the policy that basically allow people to do the wrong thing? And at least in the Army and in the Marines, I know that if you just let the soldiers loose, they'll find the loopholes. No offense to anyone. All right. So did the organization's written standards or written policy at any level create an unsafe situation?
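The bottom-up walk the speaker has been describing, unsafe act first and then each latent layer in turn, can be summarized as a simple checklist. This is an illustrative sketch only; the step labels and questions are paraphrased from the talk, not official HFACS terminology.

```python
# Illustrative checklist of the investigator's bottom-up walk described
# in the talk (labels paraphrased, not official HFACS step names).
STEPS = [
    ("Unsafe act", "What happened? One active-failure code per finding."),
    ("Individual", "Medical/physical, state of mind, sensory, mental awareness."),
    ("Leader",     "Unsafe supervision; failure to monitor, correct, enforce."),
    ("Support",    "Resources: personnel, funding, equipment, tech environment."),
    ("Training",   "Incorrect, incomplete, or insufficient training."),
    ("Standards",  "Unclear, impractical, or inadequate policy and SOPs."),
]

for number, (layer, question) in enumerate(STEPS, start=1):
    print(f"Step {number}. {layer}: {question}")
```

The first step yields the single active failure; every later step can contribute latent failures to the same finding.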
So those are the categories that HFACS looks at, again, providing a structure to the investigation. The investigation also obviously looks at materiel and environmental factors, but since human factors are such a big issue in mishaps, we look deep into human factors as opposed to just giving you "as a result of human error." Okay, we need to make that more granular and try to look at the factors that actually caused the mishap to occur. And that is all I have for you, and I'm asking for any questions or comments. So Reason's theory, and in the Navy, with Wiegmann and Shappell, this was a really big, huge deal. And from the Navy it went up to DOD-wide. But I heard what happened is the nanocodes got so teeny. Okay, so can you tell me what happened with that? Is it reasonable now, or where is it at? Well, you're looking at, this is version 8.0. So in terms of, you know, this occurred somewhere between... Get rid of them and get some money, right? Remember that old couple that was leaving, right? Let them know that they have an open mic. Yeah, just tell them you have an open mic. Oh, it's an open mic? I thought it was... Don't do it. Yeah, I am learning some stuff. They saw military stuff today. Yeah, so let them know it was not sensitive chat. Right. Once again, anyone talking about military stuff, do you have an open mic? No, I'm muted and he's muted now. Oh, okay. So this is version 8.0, like I said. There's been a lot done, at least from seven to eight, which is the version that I've been involved with, in terms of trying to simplify all the categories. Just like before, for example, spatial disorientation was one, degraded visual environment was another one, auditory perception issues was one. So basically all of those got combined into one big category that has all of the sensory stuff in it. So it's all kind of the same thing. Same thing for psychological factors.
Again, there were a number of categories that got consolidated into one category so that it's a little bit easier not only to use, but also for folks looking at trends: they can actually look at that, at least across the board. We're not completely there yet, I want to say, because the Navy and the Air Force, and basically every service, uses the system just a little differently, but it's way better than it used to be. We're not a hundred percent there yet, though. In fact, the Joint Service Safety Council met this week in Pensacola. Really? I got to be here instead, so. Yes. Man, I mean, it looks like it's appropriate for safety officers or flight crew doing an accident investigation or something like that, but is the intent really that this is understandable by a company commander? Because I look at the language, the way it's written, and it's going to be way over their heads in a lot of places, because you're using safety language that they don't use on a day-to-day basis. Well, the way I see it, in the process of reporting these investigations and these causes and recommendations and all these things, the company commander, or the low-level supervisor, is probably the farthest from anybody's mind, to be honest. Because again, these things get reported to the UCMJ authority, so it's typically an admiral or general. And then typically one of the recommendations that comes out of the board is: disseminate the findings of the board so that everyone down the chain can actually learn from this. Does that happen in all cases? Does it happen in half the cases? I would probably suggest that it doesn't, but that's what the recommendation is.
Now, the other thing that could get to a company-level officer would be, once policies and procedures are changed or modified, or the training is changed so that we can get that lesson learned incorporated into a training environment, that's when the company commander can actually get that. Maybe not the same guy; hopefully he or she learned from the actual mishap. The other thing that I very commonly get asked is, how do we report these things? Because again, there's a lot of safety information in it. It gets reported and it's public information, it's just sanitized a lot. So it comes out in different publications that the Combat Readiness Center publishes monthly. If you look at the Combat Readiness Center website, you can actually see the reports. Again, they're sanitized to the point that a report could read: a Black Hawk crashed with two fatalities, and the result of the investigation was that the pilot in control experienced spatial disorientation. But, okay, not a whole lot of detail, because all of this information is confidential. Well, for this kind of event, is the scope of it too large to be in the typical disease and non-battle injury (DNBI) report? It would be. My experience with DNBI is that, again, that comes out of routine visits, if you will, sick call and health clinic type of things. A mishap in and of itself would not fall under that. Neither would suicide events, I think. That doesn't go into DNBI, I'm not really, it does not. Yeah. Does the safety analysis result in there being a case study and a lesson plan that's prepared for young lieutenants in training? Yes. I mean, to me, this ought to go to building four at whatever they renamed Fort Benning. And this ought to be- Fort Moore. Fort Moore ought to get a good range safety officer brief: this is what happened. If you get lax or lazy, or you're looking for the brownie out of the last MRE in your cargo pocket, you're not paying attention. People could die.
That's what I got out of it. And the rest of it, and I'll pause after my abysmal reaction, but the rest of it was glossing over that, in my mind. And I get that I'm thinking more on the punitive side than the safety side, but unless we take these gross safety mishaps and somehow operationalize them into a case study that could be used for training... Comment well taken. Again, education goes out in different ways. We do train, or we try to get with the lieutenants and captains at the captain career course level and try to basically make them aware, to the point that the actual commanding general, CRC, goes down and briefs at the captain career courses. The captain career courses are now somewhat steady in terms of getting a safety brief in there, at least. Looking at ILE, looking at senior service college, looking at basically touching every military professional education point. And that's the officer side. NCO courses, same situation. Unit safety officers, they come; we actually hold 10 courses every year, ground and aviation, to have those unit safety officers involved and trained, hopefully so that, again, we can avoid these things. And this particular vignette came out of one of those ground safety officer courses. And again, there are videos and other things that I didn't produce here that are available at the website, and you can actually look at this particular incident, see how it's described, and find out what the findings were and what the recommendations were. So again, that's one portion of the education piece that we try to do; it's all incorporated into other courses, though. Because at this point, I guess the general understands that that is the way of actually reaching these folks, as opposed to saying, hey, safety, let's be safe, and blah, blah, blah. Patrick. We're still on time, we have five minutes. With regard, so it's a great question, right?
So we have the safety data, so we have safety points of interaction with the institutional Army at those officer and NCO schools, but how does this particular case feed into the education system so that we don't do this again? So in theory, and this is sort of Army specific, but I think the other services have similar mechanisms, you take a mishap and you investigate it and you determine what the factors are. As preventive medicine specialists, we look at large systems, right? Less individual cases; how do you take a case and add it into the system? So you take the safety mishap and you divide it into these categories, you find these causative factors, and you feed the causative factors into the database; you put it through the calculator and you figure it out, right? That's what the Safety Center does. The Safety Center then feeds all of the education centers, the centers of excellence, as they're called now, and AMA, the Aeromedical Activity, or some ground equivalent. They look at standards and see if any of those things were caused by a lack of standards. Those lacking standards are then put into policy, which AMA also does. So also a ground policy or infantry policy; there are a number of policies for a lot of things here too, right? So those go into the schoolhouse and they're taught, and then they come back. So they're fed into the schoolhouse, they're fed into policy, they're fed into education, and then it all works its way back through this system, where each of these stovepiped organizations is actually supposed to be synchronized, so that, exactly as you're looking for, this particular incident goes through the calculator and comes out in a way that prevents future hazards.
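The pipeline described above, where each mishap is coded with causative factors, fed into a shared database, and aggregated so analysts can spot trends across mishaps, can be sketched in a few lines. This is an illustrative mock-up only: the class names, factor labels, and mishap IDs are invented, and the real Safety Center system is of course far richer.

```python
from collections import Counter
from dataclasses import dataclass, field

# Illustrative sketch of the trend pipeline described above: each investigated
# mishap is coded with HFACS-style causative factors, reported into a shared
# store, and aggregated across mishaps. All names here are hypothetical.
@dataclass
class Mishap:
    mishap_id: str
    factors: list = field(default_factory=list)

class SafetyDatabase:
    def __init__(self):
        self._mishaps = []

    def report(self, mishap: Mishap) -> None:
        """Feed one investigated mishap into the database."""
        self._mishaps.append(mishap)

    def factor_trends(self) -> Counter:
        """Count how often each causative factor appears across all mishaps."""
        return Counter(f for m in self._mishaps for f in m.factors)

db = SafetyDatabase()
db.report(Mishap("2024-001", ["complacency", "inadequate_supervision"]))
db.report(Mishap("2024-002", ["complacency", "spatial_disorientation"]))
print(db.factor_trends().most_common(1))  # -> [('complacency', 2)]
```

The point of the aggregation step is exactly what the speaker describes: once factors recur across many mishaps, the trend (not the individual case) is what drives changes to standards, policy, and schoolhouse curricula.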
Video Summary
Colonel Ribeiro, an aerospace medicine specialist and command surgeon at the U.S. Army Combat Readiness Center, discusses his work in mishap investigation, aiming to prevent future incidents by focusing on human factors analysis. A striking 85% of Army fatalities are due to human error. Ribeiro introduces the Human Factors Analysis and Classification System (HFACS), which is used to identify failures at different levels, ranging from individual errors to organizational deficiencies. This method emphasizes the Swiss cheese model, where multiple failures align to result in a mishap. Through a detailed case study of a training mishap involving a negligent discharge of a weapon, Ribeiro illustrates how contributory factors included individual complacency, untreated psychiatric disorders, and leadership failures. This particular mishap, he explains, demonstrates how systemic failures in training and leadership can lead to tragic outcomes, underscoring the importance of addressing both active and latent failures. Ribeiro also highlights how these investigations' findings are used not just to assign blame but to improve safety policies, training, and risk management practices to prevent future incidents.
Keywords
aerospace medicine
mishap investigation
human factors analysis
Army fatalities
HFACS
Swiss cheese model
training mishap
systemic failures
safety policies