No, that is not the correct definition of a prime. You can find the right one on Wikipedia.
The reason that 1 is not a prime is to preserve unique factorization. A basic fact from number theory (the fundamental theorem of arithmetic) is that every integer greater than 1 factors uniquely, up to order, as a product of primes, e.g. 21 = 3 times 7. If 1 were a prime, we'd also have 21 = 3 times 7 times 1, 21 = 3 times 7 times 1 times 1, and so on, and uniqueness would be lost. That's bad.
In more general number rings, other units (e.g. i in the Gaussian integers) are also excluded from the primes for the same reason. The exclusion is not arbitrary.
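A quick sketch of the point in Python: a standard trial-division factorization produces exactly one multiset of primes for each integer greater than 1, which is only possible because 1 is excluded (the function name and code are illustrative, not from the original comment).

```python
def prime_factors(n):
    """Return the prime factorization of n > 1 as a sorted list."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:      # divide out each prime completely
            factors.append(d)
            n //= d
        d += 1
    if n > 1:                  # whatever remains is itself prime
        factors.append(n)
    return factors

print(prime_factors(21))  # [3, 7]
# If 1 counted as a prime, [1, 3, 7], [1, 1, 3, 7], ... would all be
# "prime factorizations" of 21, and uniqueness would be lost.
```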
As a complement to your comment:
For more information, see Unique Factorization Domains (https://en.wikipedia.org/wiki/Unique_factorization_domain) for the generalization of how prime factorization works on things other than integers.