Carr claims that automation is not always a good thing. He opens strongly with pathos, describing plane crashes that killed hundreds of people aboard. He states that the pilot in the Buffalo incident “did precisely the wrong thing,” most likely because he and his co-pilot lacked awareness of the situation at hand; that lack of awareness caused both to react poorly in the short time allotted to correct a simple issue with the plane. Carr feels that the “crash which killed all forty-nine people onboard as well as one on the ground should not have happened”. Policy is raised in the question of who is at fault in these incidents; as previously stated, Carr believes the fault lies with the pilots. Value is also raised concerning human life. This brings up the questions: how many human lives do we sacrifice before we realize automation is costing them? How much do we value human life? This also touches on ethos: should we ethically allow the deaths of “228 crew and passenger,” or a crash that “killed all forty-nine people on board as well as one person on the ground,” because of the lack of awareness automation produces? Should we ethically enforce more precautionary measures to avoid these situations? The facts he provides, for example that a pilot manually flies the plane for only three to four minutes because of automation, also appeal to the reader’s logos and may make them question why that is. Given the techniques he uses, Carr’s target audience consists of people who are already skeptical of advancements in technology, whom he encourages in their views. Yet he is also trying to reach those who believe automation has a positive impact, to convince them that its impact, in the long run, will do more harm than good to the world.