Home
Team Email: [email protected]
The MoodSphere project aims to enhance the music recommendation experience by incorporating user emotions as a key factor in suggesting personalized playlists. Unlike traditional music recommenders, which often rely on static factors such as genres and artists, this project dynamically adjusts recommendations based on the user's current emotional state. Using emotion analysis techniques, the system determines how the user is feeling at a given moment and curates playlists that resonate with those feelings, creating a more engaging and relevant music experience.
The system maintains user profiles that store historical emotional data and music preferences. By continuously learning from user interactions and feedback, the recommender adapts its understanding of the user's evolving emotional states over time, so recommendations become progressively more accurate and better aligned with the user's preferences. By combining emotion analysis with adaptive machine learning models and user-centric design, MoodSphere aims to deliver a personalized and emotionally resonant music discovery experience for each user.
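As a minimal sketch of this core idea (the emotion labels, audio attributes, and track fields below are illustrative assumptions, not the project's actual implementation), a detected emotion could be mapped to target audio attributes, with candidate tracks ranked by closeness to those targets:

```python
# Hypothetical sketch: map a detected emotion to target audio attributes,
# then rank candidate tracks by how closely they match those targets.
# Emotion labels, attribute names, and track fields are illustrative assumptions.

EMOTION_TARGETS = {
    "happy":   {"valence": 0.9, "energy": 0.8, "tempo": 125.0},
    "sad":     {"valence": 0.2, "energy": 0.3, "tempo": 80.0},
    "angry":   {"valence": 0.3, "energy": 0.9, "tempo": 140.0},
    "neutral": {"valence": 0.5, "energy": 0.5, "tempo": 100.0},
}

def score_track(track: dict, target: dict) -> float:
    """Lower score = closer match. Tempo is scaled to the 0-1 range of the other attributes."""
    return (
        abs(track["valence"] - target["valence"])
        + abs(track["energy"] - target["energy"])
        + abs(track["tempo"] - target["tempo"]) / 200.0
    )

def recommend(emotion: str, tracks: list[dict], k: int = 10) -> list[dict]:
    """Return the k tracks whose attributes best match the detected emotion."""
    target = EMOTION_TARGETS.get(emotion, EMOTION_TARGETS["neutral"])
    return sorted(tracks, key=lambda t: score_track(t, target))[:k]
```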
- Emotion Analysis: Implement advanced emotion analysis algorithms to identify the user's emotional state from inputs such as facial expressions or text.
- User Profiling: Create user profiles that store historical emotional data and music preferences, allowing the system to continuously adapt and enhance recommendations over time.
- Real-Time Emotion Detection: Integrate real-time emotion detection capabilities to provide dynamic and responsive music recommendations based on the user's varying emotional states.
- Machine Learning Models: Develop machine learning models that map emotional states to specific musical attributes, genres, or artists, enabling precise and nuanced recommendations.
- Music Database Integration: Integrate a comprehensive music database that includes metadata such as genre, tempo, and lyrical content to match the user's emotional preferences.
- User Interface (UI): Design a user-friendly interface that allows users to input emotions, view recommendations, and provide feedback, contributing to the refinement of the recommendation algorithms.
- Feedback Loop: Implement a feedback loop mechanism to gather user feedback on the accuracy of recommendations, ensuring continuous improvement and fine-tuning of the recommendation engine (a minimal sketch follows this list).
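As a hedged illustration of the feedback loop and adaptive user profiling described above (the attribute names, starting values, and learning rate are assumptions, not the project's actual code), user feedback could nudge a stored per-emotion preference profile toward what the user actually liked:

```python
# Hypothetical sketch of the feedback loop: after each recommendation, user
# feedback shifts a per-emotion preference profile via an exponential moving
# average, so later recommendations drift toward tracks the user liked.
# Field names and the 0.2 learning rate are illustrative assumptions.

ALPHA = 0.2  # how strongly one piece of feedback shifts the profile

def update_profile(profile: dict, emotion: str, track: dict, liked: bool) -> None:
    """Pull the stored attribute targets for this emotion toward (or away from) the track."""
    target = profile.setdefault(emotion, {"valence": 0.5, "energy": 0.5})
    direction = 1.0 if liked else -0.5  # move toward liked tracks, gently away from skipped ones
    for attr in target:
        delta = track[attr] - target[attr]
        target[attr] = min(1.0, max(0.0, target[attr] + ALPHA * direction * delta))

profile = {}
update_profile(profile, "happy", {"valence": 0.95, "energy": 0.7}, liked=True)
print(profile["happy"])  # the stored targets have shifted toward the liked track
```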
View Project Description as PDF | Download Project Description as Word Document
| Name | Role | Email |
| --- | --- | --- |
| Urmil Trivedi | Architect/Developer | [email protected], [email protected] |
| Dhyey Dave | Lead Developer | [email protected], [email protected] |
| Nisarg Bhuva | Scrum Master/QA | [email protected], [email protected] |
| Krushil Sheladiya | UI/UX Developer | [email protected], [email protected] |
| Bhavik Chopra | Data Scientist | [email protected], [email protected] |
| Mahesh Nakka | Backend Developer/QA | [email protected], [email protected] |
| Shane Parmar | Backend Developer/Product Owner | [email protected], [email protected] |
| Vijay Devkate | ML Engineer/UI Designer | [email protected], [email protected] |
MoodSphere combines the fluidity of React on the frontend with the reliability of Flask on the backend, using Firebase's real-time database and authentication to make the overall system more dependable. Machine learning powers the application through a CNN model trained to analyze the user's mood and recommend music accordingly. Development workflows were created and tracked in Jira, Visual Studio Code was used for coding tasks, and GitHub served as the repository for the codebase, ensuring streamlined updates and team collaboration.
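As an illustrative sketch of how these pieces could fit together (the endpoint name, model path, 48x48 grayscale input shape, and label order are assumptions, not the project's actual code), a Flask route might accept a face image uploaded from the React frontend, run the CNN, and return the predicted emotion as JSON:

```python
# Hypothetical sketch of the Flask + TensorFlow serving path: the frontend posts
# a face image, the CNN predicts an emotion, and the label drives recommendations.
# The model path, input shape, and label order are illustrative assumptions.

import io

import numpy as np
from flask import Flask, jsonify, request
from PIL import Image
import tensorflow as tf

app = Flask(__name__)
model = tf.keras.models.load_model("models/emotion_cnn.h5")  # assumed path
LABELS = ["angry", "happy", "neutral", "sad"]                # assumed label order

@app.route("/predict-emotion", methods=["POST"])
def predict_emotion():
    """Accept an uploaded image, run the CNN, and return the predicted emotion."""
    img = Image.open(io.BytesIO(request.files["image"].read()))
    img = img.convert("L").resize((48, 48))                  # grayscale, model input size
    batch = np.asarray(img, dtype="float32")[None, :, :, None] / 255.0
    probs = model.predict(batch)[0]
    return jsonify({"emotion": LABELS[int(np.argmax(probs))]})

if __name__ == "__main__":
    app.run(debug=True)
```

The returned label would then feed the recommendation step, with the frontend displaying the curated playlist.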
HTML5 | CSS3 | JavaScript | Python | React | Flask | Tailwind CSS | Material UI | TensorFlow | Firebase | VS Code | GitHub | Jira
-
Watch Deliverable 1 Presentation Video | Click here to download MP4 file
1a. View Deliverable 1 Presentation Slides as PDF
1b. Download Deliverable 1 Presentation Slides as PowerPoint
-
Watch Deliverable 2 Presentation Video | Click here to download MP4 file
2a. View Deliverable 2 Presentation Slides as PDF
2b. Download Deliverable 2 Presentation Slides as PowerPoint
2c. Watch Prototype Video | Click here to download MP4 file
2d. Frontend Source Code | Backend Source Code
-
Watch Deliverable 3 Presentation Video | Click here to download MP4 file
3a. View Deliverable 3 Presentation Slides as PDF
3b. Download Deliverable 3 Presentation Slides as PowerPoint
3c. Watch Deliverable 3 Application Demo Video | Click here to download MP4 file
3d. Frontend Source Code | Backend Source Code
-
Watch Deliverable 4 Presentation Video | Click here to download MP4 file
4a. View Deliverable 4 Presentation Slides as PDF
4b. Download Deliverable 4 Presentation Slides as PowerPoint
4c. Watch Deliverable 4 Application Demo Video | Click here to download MP4 file
4d. Frontend Source Code | Backend Source Code
View Team Working Agreement as PDF | Download Team Working Agreement as Word Document
Conceptual Architecture Diagram | Sequence Diagram | Class Diagram | ER Diagram | Context Diagram | State Diagram
View User Stories as PDF | Download User Stories as Excel
View Test Cases as PDF | Download Test Cases as Excel
View Acceptance Criteria as PDF | Download Acceptance Criteria as Excel