Inevitable Waffles [Ohio]

Mid 30’s IT/Medical Device support and quality guy. I like cycling, video games, and singing.

  • 0 Posts
  • 30 Comments
Joined 1 year ago
Cake day: June 1st, 2023

  • It depends on how the company goes about it. The larger the company, the more established the HR department. They may use their HR platform to conduct the check, which may find any and everything. Smaller companies may only check recent background with a local firm. Price is the name of the game: the more in-depth the background check, the more it costs. If you are going to work in a bank or with kids, be prepared for the company/school to use the state equivalent of the FBI. Mom and pop shops may just take your word on the application. If you see a national HR platform like Paycom, the results can vary depending on the package the company purchases.

    I just realized I didn’t answer your question, though. The main issue with data brokers is that you, the employer, for the most part can’t use them, or are legally dissuaded from doing so. We can only use official records to judge your trustworthiness. Things like data brokers are a grey area: they’re not admissible in a background check under most EEOC standards, but people use any system they want. It’s on the job seeker to prove they were discriminated against.

    As someone who hires people regularly, I only use the information provided by the HR platform. I don’t google people because I wouldn’t want that to happen to me. Other people may not have the same compunctions.

    Edit: Actually answering the damn question.

  • You can doubt all you like, but we keep seeing the training data leak out with passwords and personal information. This problem won’t be solved by the people who created it, since they don’t care, and fundamentally the technology will always show that lack of care. FOSS ones may do better in this regard, but they are still datasets without context. That’s the crux of the issue: the program or LLM has no context for what it says. That’s why you get these nonsensical responses telling people that killing themselves is a valid treatment for a toothache. Intelligence is understanding. The “AI” or LLM or, as I like to call them, glorified predictive text bars, doesn’t understand the words it is stringing together, and most people don’t know that due to flowery marketing language and hype. The threat is real.