Hi:
I was wondering if any of you have ever held onto a job where your employers made false promises to you? If so, what were your experiences?
Here are some of mine:
-Nearly 20 years ago, I worked for a food company at an airport, which was a very toxic work environment. Even though they stuck me on cleaning duties and didn't want me doing anything else, two of the managers told me:
1. They were going to give me a raise, which never happened.
2. They were going to promote me to manager, which I took seriously. Meanwhile, when I mentioned it to other co-workers, they confirmed that the managers were just teasing me.
-Nearly 10 years ago, I worked on commission at a commercial real estate firm where both brokers:
1. Jerked me around about real estate school, which they made me pay for myself on the promise that they would reimburse me once I got licensed. They also let me go, claiming I could not make cold calls without a license because the by-laws had supposedly changed.
2. Said they were going to pay me and never did.
3. Called me out of the blue in 2018 looking for an assistant, offering a part-time wage. However, they kept pushing back the start date with excuses.