There have been many documentary films over the years, and some have had a huge impact on people's understanding of the food they eat and how it affects them and the planet.
Below is a list of the films I feel everyone should watch at least once, and I think you would struggle not to go vegan after seeing them. They break down the barriers between us and where our food comes from, and make a compelling case for why we should collectively change and adapt.
In no particular order, here is the list:
Seaspiracy
Eating Our Way to Extinction
The Game Changers
Cowspiracy
Dominion
What the Health
Food Inc.
Before the Flood
There are many others, including plenty I haven't seen yet, but once you open the door to what eating animal products means for the environment, for all animals (farmed or wild), for our health, and for our survival as a species, it's pretty hard to carry on as before.
Give them a try. However harrowing the content can be, you won't regret watching them.