resource data #265
-
Hi alvarobartt! Thank you for investpy; I was happy when I found it. I would like to ask how you generated and saved the .csv files with the stocks/funds etc. data. As far as I can see, investpy uses the tickers from those saved .csv files, am I right? When I compared against the number of US stocks on investing.com, I found about three times more stocks there. I checked a few of them, and my guess is that stocks from the OTC market aren't saved in the .csv. I also found some other stocks that are missing from the .csv. I tried to scrape the tickers from the website (with BeautifulSoup) but wasn't able to. Did you try that? I would like to use investpy with all stocks from the US and EU markets; is that possible? Thank you! Best regards,
-
Hi @beze20! 😄 I'm glad you liked investpy and found it useful. The script for retrieving the data is currently private, but it basically sends requests to Investing.com's internal API to retrieve all the meta information from the website. You can manually add the missing entries to the CSV files, including all the fields, or you can open an issue describing which ones are missing, with the website URL for each of them attached. Thank you also for using this Discussions page! Feel free to follow me both on GitHub and on Twitter at @alvarobartt 👍🏻
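Manually adding an entry "including all the fields", as suggested above, amounts to appending a complete row to a local copy of the CSV. A minimal sketch follows; the field list is an assumption about the file layout, so check the actual header of the CSV you are editing first.

```python
import csv
import io

# Assumed column layout of the stocks CSV; verify against the real header.
FIELDS = ["country", "name", "full_name", "isin", "currency", "symbol"]

def append_stock(csv_text, row):
    """Return the CSV text with `row` appended; every field is required."""
    missing = [f for f in FIELDS if f not in row]
    if missing:
        raise ValueError(f"missing fields: {missing}")
    out = io.StringIO()
    out.write(csv_text.rstrip("\n") + "\n")
    csv.DictWriter(out, fieldnames=FIELDS).writerow(row)
    return out.getvalue()

new_text = append_stock(
    "country,name,full_name,isin,currency,symbol\n",
    {"country": "united states", "name": "Tesla", "full_name": "Tesla Inc",
     "isin": "US88160R1014", "currency": "USD", "symbol": "TSLA"},
)
print(new_text.splitlines()[-1])
# united states,Tesla,Tesla Inc,US88160R1014,USD,TSLA
```

Validating that every field is present before writing avoids partially filled rows, which is exactly the failure mode the maintainer's "including all the fields" caveat is warning about.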
-
Hi alvarobartt! First of all, I really appreciate your project. I've learned a lot reading your code, and it is really useful for a project I'm currently working on. Again, thanks a lot for your work!