Releases · Exabel/python-sdk
Version 4.4.0
Version 4.3.0
Added
- Added option `--case-sensitive-signals` to `load_time_series_from_file` for preserving case-sensitive signal names. Note that this requires the standard column headers to be in lowercase.
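For reference, a minimal sketch of invoking the loader with the new flag, assuming the script is exposed as the module `exabel_data_sdk.scripts.load_time_series_from_file`; the `--filename` and `--api-key` arguments are illustrative assumptions and should be checked against the script's `--help` output:

```python
import subprocess
import sys

# Hypothetical invocation of the time series loader with the new flag.
# --filename and --api-key are assumptions; --case-sensitive-signals is
# the new option from this release.
subprocess.run(
    [
        sys.executable, "-m", "exabel_data_sdk.scripts.load_time_series_from_file",
        "--filename", "time_series.csv",   # hypothetical input file
        "--api-key", "YOUR_API_KEY",       # placeholder credential
        "--case-sensitive-signals",        # preserve signal name casing
    ],
    check=True,
)
```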
Version 4.2.0
Added
- Support for searching for folder items in the Library API.
- Support for listing relationships with the script without specifying a from or to entity.
- Support for folder descriptions.
- Support additional prediction model run arguments.
Fixed
- Handle signal lowercasing for long form files.
- Time series import improvements in error, memory and result handling.
Version 4.1.0
Added
- Support for data export from AWS Athena.
- Batched query support in ExportAPI.
- Support for TagService in AnalyticsAPI.
- GCP Service Account credentials can be specified as a string with the `--credentials-string` argument.
- `ExportApi.signal_query` now has the ability to specify `resource_names`.
- New method in `relationship_api` to list all relationships of a given type.
- All pageable list methods now have an equivalent utility method `get_<resource>_iterator`, which can be called to get an iterator that pages through all available resources, e.g. `list(client.entity_api.get_entity_type_iterator())`. See the sketch after this list.
- New `--skip-validation` argument to the time series loaders to skip validation if set. This can speed up reading of time series from file if the input file is pre-validated.
- Support for batch upload for all resource types. The new argument `--batch-size` specifies how many data rows to include per batch.
- Allow time series to be loaded to the global entity (`entityTypes/global/entities/global`).
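A minimal sketch of the iterator utility; the iterator call is taken from the release note above, while the client construction (import path and `api_key` argument) follows the SDK's usual pattern but should be treated as an assumption here:

```python
from exabel_data_sdk import ExabelClient

# Construct a client; the credential is a placeholder.
client = ExabelClient(api_key="YOUR_API_KEY")

# Page through all entity types without manual page-token handling.
entity_types = list(client.entity_api.get_entity_type_iterator())
for entity_type in entity_types:
    print(entity_type.name)
```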
Changed
- Default login timeout for Snowflake connections set to 15 secs.
- Default timeout for API requests in the client set to 15 mins.
- Max metadata size for gRPC headers increased to 16MB.
- Multiple user tokens can now be stored locally when using the `ExportApi` with the new `--user` argument.
Fixed
- Improvements for faster parsing of time series in file upload.
- General improvements to logging to remove noise.
Version 4.0.0
Added
- Ability to set explicit entity types and identifier type when uploading relationships and time series.
Changed
- Bump `protobuf` dependency to >=4.
- When uploading time series data, use the new batch endpoint to improve upload speed.
- When uploading entities, the second column (if present) is used as the display name by default when an explicit display name column is not provided.
- Column names when uploading entities, time series and relationships are now case insensitive. This is helpful when exporting and loading data from a SQL-based system. Note: This is a breaking change if your data pipeline deletes and recreates signals, your signals have mixed-case names, and you refer to these signals in DSL expressions.
- Reading results from a SQL query is now performed in batches of 100k rows.
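The batching itself is internal to the SDK, but the approach can be illustrated with a generic sketch of chunked query reading (illustrative pandas/sqlite code, not the SDK's implementation):

```python
import sqlite3
import pandas as pd

# Set up a throwaway in-memory table so the sketch is self-contained.
connection = sqlite3.connect(":memory:")
connection.execute("CREATE TABLE prices (date TEXT, value REAL)")
connection.executemany(
    "INSERT INTO prices VALUES (?, ?)", [("2023-01-01", 1.0)] * 10
)

# Read the query result in fixed-size batches instead of materializing
# everything at once; the SDK now uses batches of 100k rows.
for batch in pd.read_sql_query("SELECT * FROM prices", connection, chunksize=100_000):
    print(f"processing batch of {len(batch)} rows")
```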
Deprecated
- The argument `--namespace` is ignored, as the namespace is now retrieved from the API when uploading entities, relationships and time series.
- The arguments `--entity-from-column` and `--entity-to-column` have been replaced with `--from-entity-column` and `--to-entity-column`, respectively, when loading relationships.
- The argument `--name-column` has been replaced with `--entity-column` when loading entities.
Removed
- Python 3.6 is no longer supported.
- The REST / HTTP implementation of the client.
Fixed
- Include missing entity types metadata received from the API when working with signals.
- Improve log output when an unknown format is encountered when uploading time series.
- Gracefully handle time series files with no data.
Version 3.9.0
Added
- Support for importing Excel files.
- Support for uploading time series to the global entity.
- Support for API key authentication in the `ExportApi` class and the export data script.
- Utility script for checking the mapping of company identifiers.
Changed
- Duplicate data points are now dropped before uploading.
- Time series with duplicates in the index (more than one data point per entity, signal, date and known time) are now dropped before uploading.
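The deduplication happens inside the SDK before upload; conceptually it resembles dropping repeated (entity, date) index entries, as in this illustrative pandas sketch (not the SDK's actual code):

```python
import pandas as pd

# A long-form series with a duplicated (entity, date) index entry.
index = pd.MultiIndex.from_tuples(
    [
        ("entityTypes/company/entities/acme", "2023-01-01"),
        ("entityTypes/company/entities/acme", "2023-01-01"),  # duplicate point
        ("entityTypes/company/entities/acme", "2023-01-02"),
    ],
    names=["entity", "date"],
)
series = pd.Series([1.0, 1.0, 2.0], index=index, name="my_signal")

# Drop duplicate index entries before uploading (illustrative only).
deduplicated = series[~series.index.duplicated(keep="first")]
print(deduplicated)
```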
Fixed
- Include missing entity types metadata received from the API when working with signals
Version 3.8.0
Added
- Specifying the from and to entity columns is now optional when uploading relationships. By default, relationships are loaded from the entities in the first column to the entities in the second column.
- Method to get accessible namespaces.
- XASE as a potential market in the create entity mapping script.
- Excel (.xlsx) as an output format for `FileWriter`.
- Option to force reauthentication of the user when using the Export API.
Changed
- Slightly improved logging when running SQL queries and uploading resources.
- Removed argument `--description` from the delete relationship script (not in use).
Fixed
- Providing empty identifiers when uploading resources now raises a user-friendly error.
- Raise a proper exception when attempting to upload a time series with invalid data points instead of failing silently.
Version 3.7.1
Fixed
- Added dependency restriction for `protobuf`.
Version 3.7.0
Added
- Support for setting hosts in the format 'hostname[:port]' in the command-line scripts (see the sketch after this list).
- Operations for managing library folders
- Support for Bloomberg symbols in CSV upload scripts
- Support for exporting data from Google BigQuery
- Import time series endpoint
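A small sketch illustrating the accepted 'hostname[:port]' format; this is an illustrative parser with an assumed default port, not the SDK's actual implementation, and the host and port values are placeholders:

```python
def parse_host(value: str, default_port: int = 443) -> tuple[str, int]:
    """Split a 'hostname[:port]' string; the port part is optional."""
    host, _, port = value.partition(":")
    return host, int(port) if port else default_port

# Examples of the accepted format (placeholder values):
print(parse_host("api.example.com"))        # ('api.example.com', 443)
print(parse_host("api.example.com:21443"))  # ('api.example.com', 21443)
```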
Changed
- Refactor SQL reader functionality
- Refactor all print statements to logging
Deprecated
- Deprecate the REST/HTTP client
Fixed
- Fix an exception that was raised when the derived signal unit is not set.
Version 3.6.0
Added
- Support for exporting data from Snowflake to a file.