Hello Jihanne, great question! I think I inherently held this value for most of my life, but the need to act on it and speak out about it became more salient once I started pursuing a degree in social sciences. I believe this is because so much of what we study addresses trauma, historical movements, and (especially in sociology) inconsistencies in what is considered "truth" and "rights" for different groups of people. We go in depth into the reasons a person of color could be deemed less than human during something like the African slave trade, or why the master status of felon keeps a person from getting a job, housing, government assistance, and so on. In one of my classes we drew parallels between the vilification of Jews leading up to the Holocaust and the very similar measures being exercised against the Muslim population (or even those perceived to be Muslim) in the United States today. I think it is so important to us because we study how these trends of dehumanization, demonization, vilification, and persecution of "others" shift the blame from one group to another over time, and are usually driven by those in power. We have an obligation to use this knowledge to try to make things better rather than perpetuate a flawed and discriminatory system.