
Job Portal Web Scraping

About this project


Overview

The project addresses the need for job seekers to identify trending roles and their locations. It aims to offer personalized and current insights using LinkedIn data. The process involves web scraping, data cleaning, transformation, and visualization for a comprehensive job market overview.

Project Steps

  1. User input & URL generation: The project begins by taking user input for the desired job title and generating a corresponding URL for web scraping.
  2. Web Scraping: Python's requests library was used to retrieve the HTML content from the server and store it in a Python object. Beautiful Soup was then used to parse the lengthy markup and make it easier to extract the relevant information.
  3. Create DataFrame: Job data, including titles, companies, locations, and levels, is stored in a structured pandas DataFrame.
  4. Data Cleaning: The location field is split into city and state, and entries with missing values are removed for accuracy.
  5. Data Analysis and Visualisation: Using the seaborn and matplotlib packages, the data was analysed and visualised to determine the most common job levels being recruited for, the distribution of job levels across states, the industries actively hiring, the relationship between job levels and industries, and the association between job functions and job levels.
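Steps 1 to 3 above can be sketched as follows. This is a minimal illustration, not the project's actual code: the guest search endpoint and the CSS class names are assumptions, and an inline HTML snippet stands in for the live response (in the real pipeline the HTML would come from requests.get(url).text against LinkedIn's markup, which may differ).

```python
import urllib.parse

import pandas as pd
from bs4 import BeautifulSoup


def build_search_url(job_title: str, location: str = "United States") -> str:
    """Step 1: turn user input into a job-search URL (endpoint assumed)."""
    params = urllib.parse.urlencode({"keywords": job_title, "location": location})
    return f"https://www.linkedin.com/jobs/search?{params}"


# Inline stand-in for a scraped page; class names are illustrative only.
SAMPLE_HTML = """
<ul>
  <li><h3 class="job-title">Data Analyst</h3>
      <h4 class="company">Acme Corp</h4>
      <span class="location">Austin, TX</span></li>
  <li><h3 class="job-title">Data Engineer</h3>
      <h4 class="company">Globex</h4>
      <span class="location">New York, NY</span></li>
</ul>
"""


def parse_jobs(html: str) -> pd.DataFrame:
    """Steps 2-3: parse job cards with Beautiful Soup into a DataFrame."""
    soup = BeautifulSoup(html, "html.parser")
    rows = []
    for card in soup.find_all("li"):
        rows.append({
            "title": card.find("h3", class_="job-title").get_text(strip=True),
            "company": card.find("h4", class_="company").get_text(strip=True),
            "location": card.find("span", class_="location").get_text(strip=True),
        })
    return pd.DataFrame(rows)


url = build_search_url("data analyst")
jobs = parse_jobs(SAMPLE_HTML)
print(url)
print(jobs)
```

Against a live page, the same parse_jobs logic would be pointed at the HTML returned by requests, with the selectors updated to match the site's real markup.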

Conclusion

By offering insights into job market trends, this project empowers job seekers and employers with valuable decision-making information. It combines web scraping and data analysis to provide a clear and informative job landscape view.
