Hello everyone,
I have been at my new job for just over 90 LONG days now. I can honestly say this is, by far, the worst experience of my life. During the interview my boss never told me what was expected of me, meaning what was to come. I took this job leaving a beautiful part-time job behind. I loved, loved that job, but because it was part-time I had to take a full-time position. I am single with no family.
During the interview my boss did tell me I would take training classes, and since I would be in charge of the company documents I thought, great (most jobs rarely offer training classes anymore). To make a long story short, it turned out to be classes alongside medical professionals who had schooling for this. We had four tests and a final, all in medical terminology, and I was the only one who had never gone to school for it; I was hired as clerical staff, and I make next to nothing. I also had to learn my job while being ignored by management. I was given a booklet to study and told I could not study during my eight hours there. So in four weeks I had four tests and a final, on top of learning my clerical job. I started having panic attacks at night. I couldn't go to my boss; from the moment I was hired I was a cockroach, meaning they wouldn't even look my way. I have since learned people have fled this place in horror.
So my question: has an employer ever lied to you so you would take the job? Have you ever taken a job and realized you made a HUGE mistake? I feel I am seeing very unethical practices by employers lately. Oh, and YES, I am currently looking; it is so hard to job hunt when you work all day.
Any advice would be great: am I justified in being totally angry at my boss? I heard he gives these tests and the final to weed out weak employees. I did fail the final, but nothing was done about it (though it caused me such stress that I ended up in the hospital, since I thought I would be fired and probably end up homeless). I have NEVER taken a job (an office job paying peanuts), been given tests and a final, and felt so blindsided. I am so sad lately.
Thank you for reading my story.