Question by  longhorn24 (9)

Do companies have to offer you a bonus?

Why or why not?

 
+7

Answer by  RunawayJim (964)

No, they do not. Companies often offer bonuses to encourage employees to work harder or better, but if you are in a low-paying job, chances are you will never see one.

 
+5

Answer by  Gabriel (2146)

No. Bonuses are typically paid out when things are going well. Generally, the terms for earning a bonus are agreed upon at hiring.
