Memory is one of the resources your programs consume. Many languages ease memory management by being 'garbage collected', which means that rather than the software author having to keep track of memory and release it when they are finished with it, the language runtime keeps track and releases it on their behalf. Garbage collection schemes are both old (the original LISP had GC) and the subject of ongoing research.
There are many schemes for garbage collection of memory, from simple but effective ones such as Perl 5's reference counting, or Python's combination of reference counting with a cycle detector, through to the wider field of tracing garbage collectors.
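Python's hybrid approach can be seen directly from its `gc` and `sys` modules: reference counting frees most objects immediately, while the cycle detector handles the structures refcounting alone cannot. A minimal sketch:

```python
import gc
import sys

# Reference counting: an object is freed the moment its count hits zero.
x = []
# getrefcount reports at least 2 here: the name 'x' plus the temporary
# reference created by passing x as an argument.
print(sys.getrefcount(x))

# Reference cycles defeat pure refcounting: these two lists point at
# each other, so their counts never reach zero even after 'del'.
a, b = [], []
a.append(b)
b.append(a)
del a, b  # the cycle lingers...

# ...until Python's cycle collector sweeps it up. gc.collect() returns
# the number of unreachable objects it found.
freed = gc.collect()
print(f"cycle collector found {freed} unreachable objects")
```

This is why Python needs both mechanisms: refcounting gives prompt, predictable reclamation for the common case, and the tracing pass mops up the cycles.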
Many programming languages include one or more garbage collection approaches in their memory management runtime; indeed, almost every mainstream language outside the C family (C, C++, Objective-C) includes some form of automatic dynamic memory management by default.
However, garbage collection is not a panacea. Indeed, it can introduce problems to your software which you would not normally encounter if you were managing everything explicitly yourself. For example, depending on the scheme in use, your program may pause for unpredictable periods at unpredictable intervals. If you need your program to always respond smoothly to external events, this kind of behaviour can be a show-stopper. Of course, there are plenty of mechanisms in place to mitigate this kind of problem, and your chosen language will have interfaces to the garbage collector to allow you to tell it when you want it to do work, but still, it can be a pain if you're not expecting it.
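As a concrete sketch of "telling the collector when to do work", here is how it might look in Python (other runtimes expose different but analogous knobs); `render` is a hypothetical stand-in for your real per-event work:

```python
import gc

def render(frame):
    # Hypothetical stand-in for real latency-sensitive per-frame work.
    return frame * 2

def latency_sensitive_loop(frames):
    # Suspend the cycle collector so it cannot pause us mid-frame.
    gc.disable()
    try:
        return [render(f) for f in frames]
    finally:
        # Re-enable the collector and do the deferred collection work
        # now, at a moment of our choosing, rather than mid-loop.
        gc.enable()
        gc.collect()
```

Note that in CPython this only defers the cycle detector; reference counting still reclaims non-cyclic garbage promptly throughout, which is part of why its pauses are milder than those of a purely tracing collector.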
Garbage collection schemes can mean that your program ends up "using" much more RAM than it is, in fact, using. On large computers this might not be an issue, but if you're writing for more resource-constrained situations, or working with super-large datasets, then this might turn out to be a problem too. Again, working with your language runtime's tuning APIs can help you mitigate this problem.
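In Python, for instance, one such tuning API is `gc.set_threshold`, which controls how much allocation is allowed to pile up before each generation is collected. Lowering the thresholds is one way to trade CPU time for a smaller peak heap; the specific numbers below are illustrative, not a recommendation:

```python
import gc

# Inspect the current generation-0/1/2 collection thresholds
# (commonly (700, 10, 10) in CPython, though defaults vary by version).
print(gc.get_threshold())

# Collect more aggressively, so cyclic garbage is reclaimed sooner at
# the cost of more frequent collection passes.
gc.set_threshold(100, 5, 5)
print(gc.get_threshold())
```

Whether this helps depends on your allocation pattern; measuring before and after with your real workload is the only reliable guide.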