MapReduce is a framework originally created by Google; for background, read their white paper:
MapReduce White Paper
It was developed for processing large amounts of data (usually stored in a distributed file system such as HDFS) in order to extract useful patterns from it.
Hadoop is essentially a collection of individual computers built from commodity hardware that together store large amounts of data (usually big data) whose size exceeds the storage capacity of any single machine.
For example, suppose your computer has 500 GB of storage and mine has another 500 GB (imagine both are completely free), and we are given a data source of 800 GB. Neither of our systems can hold this data on its own, so the idea is to combine the two machines into a cluster with a total capacity of 1000 GB.
The data is then split into blocks and stored across this combined, distributed file system, known as HDFS (Hadoop Distributed File System).
Once the data is stored, we usually need to process it, for example to find useful patterns in it; for that, Hadoop MapReduce serves as the analytical platform (a concrete sketch follows below).
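To give a concrete feel for what "processing" means here, below is a minimal sketch of the classic word-count job written against Hadoop's Java MapReduce API. The map phase runs on each node against the data blocks stored locally and emits (word, 1) pairs; the reduce phase sums the counts for each word. Class names and paths here are just illustrative, not from the white paper.

```java
// Minimal word-count sketch using the org.apache.hadoop.mapreduce API.
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: runs on each node over its local blocks of the input,
  // emitting (word, 1) for every word it sees.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {

    private final static IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reduce phase: receives all the counts emitted for a given word
  // and adds them up to produce the final total.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    // Input and output paths on HDFS, taken from the command line.
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

You would package this into a jar and submit it to the cluster with something like `hadoop jar wordcount.jar WordCount /input /output`, where `/input` and `/output` are directories in HDFS (the output directory must not already exist).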
To learn more about how it works: