
Update _predictor.py #2

Open
wants to merge 38 commits into master from hiok2000-patch-2

Changes from all commits (38 commits)
- d481c81: Update _predictor.py (hiok2000, Jan 8, 2020)
- d647aa4: prob -> log_prob (#556) (lostella, Jan 10, 2020)
- c58cf39: remove dataset class in favor of alias (#560) (lostella, Jan 15, 2020)
- 55ee968: Removed timeouts from tests. Added hard timeout of 30seconds for all … (Jan 16, 2020)
- 6338eb2: Fix bug in cdf method of piecewise linear distributions (#564) (lostella, Jan 20, 2020)
- 824159c: fixed taxi dataset cardinality (#552) (kashif, Jan 20, 2020)
- 1171f9e: fix item_id field in provided datasets (#566) (lostella, Jan 21, 2020)
- b056435: Fixed docstring. (#568) (AaronSpieler, Jan 22, 2020)
- 79685c1: add missing shape info to scale (#569) (lostella, Jan 27, 2020)
- 76bbc6c: add link to master docs, add docs badges (#570) (lostella, Jan 27, 2020)
- f44304d: N-Beats implementation for GluonTS (#553) (AaronSpieler, Jan 29, 2020)
- 611a1fd: Moved gp module to be part of gp_forecaster. (#572) (Jan 29, 2020)
- 2f10256: clean up lifted operations, add pow operation (#571) (lostella, Jan 29, 2020)
- 95a9d78: Removed expand_dims when reading in time-series values. (#574) (Jan 30, 2020)
- 8b7775f: Pandas v1.0 (#576) (Jan 30, 2020)
- 6693927: Enhanced error message for input time-series fields. (#577) (Jan 30, 2020)
- 42f512a: fix trend model to work in symbolic mode (#578) (lostella, Jan 30, 2020)
- f542207: Fix Dockerfile to use Python 3.7. (#579) (Jan 30, 2020)
- ae0be7b: disabling test for ensemble nbeats because of timeouts (#580) (lostella, Jan 30, 2020)
- 2889be4: Fix for symbol block serialization issue (#582) (canerturkmen, Jan 30, 2020)
- 5d13a3e: Update extended_tutorial.md (#590) (mehdikchouk, Jan 31, 2020)
- e28cb5b: Change causal conv1d default activation to match the doc (#586) (ehsanmok, Jan 31, 2020)
- 6f0c16a: fix for symbol block import backward compatibility (#591) (canerturkmen, Jan 31, 2020)
- 8a62e00: Updated N-Beats defaults. (#588) (AaronSpieler, Jan 31, 2020)
- b6ec6a0: More elegant stop_gradient fix for sMAPE. (#593) (AaronSpieler, Feb 5, 2020)
- 90587ba: Fixed minor bugs in GluonTSFramework. (#585) (AaronSpieler, Feb 7, 2020)
- 357c863: fix mean_ts method (#624) (lostella, Feb 12, 2020)
- 1677728: Updated black. (#625) (Feb 12, 2020)
- c1e178b: Merge branch 'master' into hiok2000-patch-2 (lostella, Feb 12, 2020)
- 90a35bf: Mypy update (#628) (Feb 12, 2020)
- d187370: Merge branch 'master' into hiok2000-patch-2 (Feb 12, 2020)
- 311ce4e: Update identity.py (#627) (hiok2000, Feb 12, 2020)
- bc24d1c: Update _base.py (#630) (changebo, Feb 12, 2020)
- cddd983: Refactored DataLoader. (#619) (Feb 13, 2020)
- 24af514: Removed `dataset.constants.py`. (#601) (Feb 13, 2020)
- 0886ce4: Removed logger.basicConfig calls. (#635) (Feb 14, 2020)
- 8fb5808: Update src/gluonts/model/r_forecast/_predictor.py (jaheba, Feb 14, 2020)
- 3a91fbf: Merge branch 'master' into hiok2000-patch-2 (Feb 14, 2020)
2 changes: 1 addition & 1 deletion Dockerfile
@@ -1,4 +1,4 @@
-FROM python:3
+FROM python:3.7
 
 ADD . /gluonts
11 changes: 9 additions & 2 deletions README.md
@@ -1,6 +1,9 @@
 # GluonTS - Probabilistic Time Series Modeling in Python
 
-[![PyPI](https://img.shields.io/pypi/v/gluonts.svg?style=flat-square)](https://pypi.org/project/gluonts/) ![GitHub](https://img.shields.io/github/license/awslabs/gluon-ts.svg?style=flat-square)
+[![PyPI](https://img.shields.io/pypi/v/gluonts.svg?style=flat-square)](https://pypi.org/project/gluonts/)
+[![GitHub](https://img.shields.io/github/license/awslabs/gluon-ts.svg?style=flat-square)](./LICENSE)
+[![Static](https://img.shields.io/static/v1?label=docs&message=stable&color=blue&style=flat-square)][stable docs url]
+[![Static](https://img.shields.io/static/v1?label=docs&message=latest&color=blue&style=flat-square)][latest docs url]
 
 GluonTS is a Python toolkit for probabilistic time series modeling,
 built around [Apache MXNet (incubating)](https://mxnet.incubator.apache.org/).
@@ -9,9 +12,13 @@ GluonTS provides utilities for loading and iterating over time series datasets,
 state of the art models ready to be trained, and building blocks to define
 your own models and quickly experiment with different solutions.
 
-* [Documentation](https://gluon-ts.mxnet.io/)
+* [Documentation (stable version)][stable docs url]
+* [Documentation (latest)][latest docs url]
 * [Paper](https://arxiv.org/abs/1906.05264)
 
+[stable docs url]: https://gluon-ts.mxnet.io/
+[latest docs url]: https://gluon-ts.s3-accelerate.dualstack.amazonaws.com/master/index.html
+
 ## Installation
 
 GluonTS requires Python 3.6, and the easiest
10 changes: 5 additions & 5 deletions docs/examples/extended_forecasting_tutorial/extended_tutorial.md
@@ -337,8 +337,8 @@ We can easily create the train and test datasets by simply filling in the correc
 ```python
 train_ds = ListDataset([{FieldName.TARGET: target,
                          FieldName.START: start,
-                         FieldName.FEAT_DYNAMIC_REAL: fdr,
-                         FieldName.FEAT_STATIC_CAT: fsc}
+                         FieldName.FEAT_DYNAMIC_REAL: [fdr],
+                         FieldName.FEAT_STATIC_CAT: [fsc]}
                         for (target, start, fdr, fsc) in zip(target[:, :-custom_ds_metadata['prediction_length']],
                                                              custom_ds_metadata['start'],
                                                              feat_dynamic_real[:, :-custom_ds_metadata['prediction_length']],
@@ -350,8 +350,8 @@ train_ds = ListDataset([{FieldName.TARGET: target,
 ```python
 test_ds = ListDataset([{FieldName.TARGET: target,
                         FieldName.START: start,
-                        FieldName.FEAT_DYNAMIC_REAL: fdr,
-                        FieldName.FEAT_STATIC_CAT: fsc}
+                        FieldName.FEAT_DYNAMIC_REAL: [fdr],
+                        FieldName.FEAT_STATIC_CAT: [fsc]}
                        for (target, start, fdr, fsc) in zip(target,
                                                             custom_ds_metadata['start'],
                                                             feat_dynamic_real,
@@ -693,7 +693,7 @@ print(f"Start date of the forecast window: {forecast_entry.start_date}")
 print(f"Frequency of the time series: {forecast_entry.freq}")
 ```
 
-We can also do calculations to summarize the sample paths, such computing the mean or a quantile for each of the 24 time steps in the forecast window.
+We can also do calculations to summarize the sample paths, such as computing the mean or a quantile for each of the 24 time steps in the forecast window.
 
 
 ```python
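The tutorial fix above wraps `fdr` and `fsc` in lists because GluonTS expects `feat_dynamic_real` to be 2-D (one row per feature) and `feat_static_cat` to be a 1-D list of categories, and the stricter shape check in this PR no longer expands dimensions silently. A standalone sketch of the shape convention in plain NumPy (field names follow the tutorial; no GluonTS import needed):

```python
import numpy as np

# Hypothetical single-series entry mirroring the tutorial's fields.
target = np.arange(100.0)        # 1-D: (num_timesteps,)
fdr = np.ones(100)               # one dynamic real feature, 1-D on its own
fsc = 0                          # one static categorical value

entry = {
    "target": target,
    "start": "2020-01-01",
    "feat_dynamic_real": [fdr],  # wrapping list -> 2-D: (1, 100)
    "feat_static_cat": [fsc],    # wrapping list -> 1-D: (1,)
}

assert np.asarray(entry["feat_dynamic_real"]).shape == (1, 100)
assert np.asarray(entry["feat_static_cat"]).shape == (1,)
```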
@@ -232,7 +232,7 @@ As recipes are just lists of expressions that evaluated sequentially, recipes ca
 
 ```python
 scaling = [
-    ("scale", rcp.RandomUniform(0, 1000)),
+    ("scale", rcp.RandomUniform(low=0, high=1000, shape=1)),
     ("z", "scale" * rcp.Ref("unscaled"))
 ]
 
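For context, the recipe above first draws a random scale, then multiplies it into the referenced series. A rough NumPy equivalent of what the two steps evaluate to (the `rng` setup and series length are illustrative, not part of the gluonts recipe API):

```python
import numpy as np

rng = np.random.default_rng(0)

# "scale": rcp.RandomUniform(low=0, high=1000, shape=1) draws one value
scale = rng.uniform(low=0, high=1000, size=1)

# "z": "scale" * rcp.Ref("unscaled") multiplies it into the referenced series
unscaled = rng.standard_normal(24)
z = scale * unscaled

assert scale.shape == (1,)
assert z.shape == (24,)
```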
2 changes: 1 addition & 1 deletion examples/gp_synthetic_example.py
@@ -3,7 +3,7 @@
 
 # First-party imports
 from gluonts.kernels import RBFKernel
-from gluonts.gp import GaussianProcess
+from gluonts.model.gp_forecaster.gaussian_process import GaussianProcess
 
 # Third-party imports
 import mxnet.ndarray as nd
2 changes: 2 additions & 0 deletions pytest.ini
@@ -5,3 +5,5 @@ markers =
     gpu: mark a test that requires GPU.
     integration: mark an integration test
     skip_master: mark a test that is temporarily skipped for mxnet master validation.
+
+timeout = 30
4 changes: 2 additions & 2 deletions requirements/requirements-setup.txt
@@ -1,4 +1,4 @@
-black==19.3b0
-mypy==0.630
+black==19.10b0
+mypy==0.761
 setuptools>=40.1.0
 setuptools_scm>=3.3.3
2 changes: 1 addition & 1 deletion requirements/requirements.txt
@@ -6,7 +6,7 @@ boto3~=1.0
 holidays>=0.9,<0.10
 matplotlib~=3.0
 numpy~=1.14
-pandas>=0.25,<0.26
+pandas~=1.0
 pydantic~=1.1
 tqdm~=4.23
 ujson~=1.35
2 changes: 1 addition & 1 deletion src/gluonts/block/cnn.py
@@ -58,7 +58,7 @@ def __init__(
         channels: int,
         kernel_size: int,
         dilation: int = 1,
-        activation: Optional[str] = "relu",
+        activation: Optional[str] = None,
         **kwargs,
     ):
         super(CausalConv1D, self).__init__(**kwargs)
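The one-line change makes `CausalConv1D` apply no activation by default, matching its docstring. As a reminder of what the layer computes, here is a NumPy-only sketch of a causal 1-D convolution (an illustration of the idea, not the MXNet implementation):

```python
import numpy as np

def causal_conv1d(x: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Causal convolution: output at time t uses only inputs at times <= t."""
    pad = len(kernel) - 1
    xp = np.concatenate([np.zeros(pad), x])  # left-pad so no future leaks in
    # note: np.convolve flips the kernel; fine for illustrating causality
    return np.convolve(xp, kernel, mode="valid")  # no activation by default

y = causal_conv1d(np.arange(5.0), np.array([1.0, 1.0]))
assert y.shape == (5,)                       # same length as the input
assert list(y) == [0.0, 1.0, 3.0, 5.0, 7.0]  # y[t] = x[t] + x[t-1]
```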
19 changes: 8 additions & 11 deletions src/gluonts/core/component.py
@@ -35,10 +35,7 @@
 # Relative imports
 from . import fqname_for
 
-DEBUG = os.environ.get("DEBUG", "false").lower() == "true"
-
-logger = logging.getLogger()
-logger.setLevel(logging.DEBUG if DEBUG else logging.INFO)
+logger = logging.getLogger(__name__)
 
 A = TypeVar("A")
 
@@ -496,17 +493,17 @@ def num_gpus(refresh=False):
     return NUM_GPUS
 
 
+@functools.lru_cache()
 def get_mxnet_context(gpu_number=0) -> mx.Context:
     """
     Returns either CPU or GPU context
     """
-    n = num_gpus()
-    if n == 0:
-        logging.info("Using CPU")
-        return mx.context.cpu()
-    else:
-        logging.info("Using GPU")
+    if num_gpus():
+        logger.info("Using GPU")
         return mx.context.gpu(gpu_number)
+    else:
+        logger.info("Using CPU")
+        return mx.context.cpu()
 
 
 def check_gpu_support() -> bool:
@@ -516,7 +513,7 @@ def check_gpu_support() -> bool:
     """
     n = num_gpus()
     logger.info(f'MXNet GPU support is {"ON" if n > 0 else "OFF"}')
-    return False if n == 0 else True
+    return n != 0
 
 
 class DType:
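Two things change in this file: the module takes its own named logger instead of reconfiguring the root logger, and `get_mxnet_context` is memoized so the device probe runs only once per `gpu_number`. A small sketch of the memoization pattern with `functools.lru_cache` (strings stand in for `mx.Context` objects, and `num_gpus` here is a stub assuming a CPU-only machine):

```python
import functools

probe_calls = 0

def num_gpus() -> int:
    """Stand-in for the (potentially expensive) GPU probe."""
    global probe_calls
    probe_calls += 1
    return 0  # assume no GPUs for this sketch

@functools.lru_cache()
def get_context(gpu_number: int = 0) -> str:
    if num_gpus():
        return f"gpu({gpu_number})"
    return "cpu"

assert get_context() == "cpu"
assert get_context() == "cpu"  # cached: the probe did not run again
assert probe_calls == 1
```

Note that `lru_cache` keys on the exact call arguments, so `get_context()` and `get_context(0)` would be cached separately.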
38 changes: 0 additions & 38 deletions src/gluonts/core/log.py

This file was deleted.
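Removing `core/log.py` (together with the `basicConfig`/`setLevel` calls dropped elsewhere in this PR) fits the usual convention for library logging: modules take a named logger and leave configuration to the application. A minimal illustration of that split (the logger name is an example):

```python
import logging

# Library code: take a module-level logger, never configure the root logger.
logger = logging.getLogger("gluonts.example")  # typically logging.getLogger(__name__)
logger.addHandler(logging.NullHandler())       # avoid "no handler" warnings

logger.info("dropped: INFO is below the default level until the app opts in")

# Application code: one place decides levels, handlers, and formatting.
logging.basicConfig(level=logging.INFO)
logger.info("now visible")
```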

2 changes: 1 addition & 1 deletion src/gluonts/core/serde.py
@@ -560,7 +560,7 @@ def decode(r: Any) -> Any:
     # r = { 'class': ..., 'args': ... }
     # r = { 'class': ..., 'kwargs': ... }
     if type(r) == dict and r.get("__kind__") == kind_inst:
-        cls = locate(r["class"])
+        cls = cast(Any, locate(r["class"]))
         args = decode(r["args"]) if "args" in r else []
         kwargs = decode(r["kwargs"]) if "kwargs" in r else {}
         return cls(*args, **kwargs)
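The `cast(Any, ...)` is there because `pydoc.locate` is typed as returning an optional `object`, which the updated mypy flags when the result is called. A runnable sketch of this decode path (the dict layout mirrors the comments in the diff; `collections.Counter` is just a convenient stand-in class):

```python
from pydoc import locate
from typing import Any, cast

r = {
    "__kind__": "instance",
    "class": "collections.Counter",
    "args": [["a", "b", "a"]],
}

if r.get("__kind__") == "instance":
    # cast silences mypy: locate() is not typed as returning a callable
    cls = cast(Any, locate(r["class"]))
    args = r.get("args", [])
    kwargs = r.get("kwargs", {})
    obj = cls(*args, **kwargs)

assert obj["a"] == 2 and obj["b"] == 1
```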
11 changes: 7 additions & 4 deletions src/gluonts/dataset/artificial/recipe.py
@@ -165,16 +165,19 @@ def __rsub__(self, other):
         return LiftedSub(other, self)
 
     def __mul__(self, other):
-        return LiftedMul(self, other, operator.mul)
+        return LiftedMul(self, other)
 
     def __rmul__(self, other):
-        return LiftedMul(other, self, operator.mul)
+        return LiftedMul(other, self)
 
     def __truediv__(self, other):
-        return LiftedTruediv(self, other, operator.truediv)
+        return LiftedTruediv(self, other)
 
     def __rtruediv__(self, other):
-        return LiftedTruediv(other, self, operator.truediv)
+        return LiftedTruediv(other, self)
+
+    def __pow__(self, other):
+        return LiftedBinaryOp(self, other, operator.pow)
 
     def __call__(
         self,
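The refactor above moves the `operator.*` argument out of the call sites and adds `__pow__`. A toy version of the lifted-operator pattern these methods implement, where operators build an expression tree that evaluates lazily against an environment (simplified stand-ins, not the gluonts classes):

```python
import operator

class Lifted:
    """Deferred expression: calling it with an environment evaluates it."""

    def __init__(self, fn):
        self.fn = fn

    def __call__(self, env):
        return self.fn(env)

    def _binop(self, other, op):
        def fn(env):
            rhs = other(env) if isinstance(other, Lifted) else other
            return op(self(env), rhs)
        return Lifted(fn)

    def __mul__(self, other):
        return self._binop(other, operator.mul)

    def __pow__(self, other):
        return self._binop(other, operator.pow)

def Ref(name):
    """Look a value up in the environment at evaluation time."""
    return Lifted(lambda env: env[name])

expr = Ref("x") ** 2 * 3     # nothing is computed yet
assert expr({"x": 4}) == 48  # 4 ** 2 * 3
```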
38 changes: 8 additions & 30 deletions src/gluonts/dataset/common.py
@@ -17,6 +17,7 @@
 from functools import lru_cache
 from pathlib import Path
 from typing import (
+    cast,
     Any,
     Callable,
     Dict,
@@ -25,8 +26,6 @@
     List,
     NamedTuple,
     Optional,
-    Sized,
-    cast,
     Union,
 )
 
@@ -35,21 +34,18 @@
 import pandas as pd
 import pydantic
 import ujson as json
 from pandas.tseries.frequencies import to_offset
 from pandas.tseries.offsets import Tick
 
 # First-party imports
 from gluonts.core.exception import GluonTSDataError
 from gluonts.dataset import jsonl, util
-from gluonts.dataset.stat import (
-    DatasetStatistics,
-    calculate_dataset_statistics,
-)
 
 # Dictionary used for data flowing through the transformations.
-# A Dataset is an iterable over such dictionaries.
 DataEntry = Dict[str, Any]
 
+# A Dataset is an iterable of DataEntry.
+Dataset = Iterable[DataEntry]
 
 
 class Timestamp(pd.Timestamp):
     # we need to sublcass, since pydantic otherwise converts the value into
@@ -152,21 +148,6 @@ class SourceContext(NamedTuple):
     row: int
 
 
-class Dataset(Sized, Iterable[DataEntry]):
-    """
-    An abstract class for datasets, i.e., iterable collection of DataEntry.
-    """
-
-    def __iter__(self) -> Iterator[DataEntry]:
-        raise NotImplementedError
-
-    def __len__(self):
-        raise NotImplementedError
-
-    def calc_stats(self) -> DatasetStatistics:
-        return calculate_dataset_statistics(self)
-
-
 class Channel(pydantic.BaseModel):
     metadata: Path
     train: Path
@@ -393,14 +374,11 @@ def __call__(self, data: DataEntry) -> DataEntry:
         value = data.get(self.name, None)
         if value is not None:
             value = np.asarray(value, dtype=self.dtype)
-            ddiff = self.req_ndim - value.ndim
 
-            if ddiff == 1:
-                value = np.expand_dims(a=value, axis=0)
-            elif ddiff != 0:
+            if self.req_ndim != value.ndim:
                 raise GluonTSDataError(
-                    f"JSON array has bad shape - expected {self.req_ndim} "
-                    f"dimensions, got {ddiff}"
+                    f"Array '{self.name}' has bad shape - expected "
+                    f"{self.req_ndim} dimensions, got {value.ndim}."
                 )
 
             data[self.name] = value
@@ -410,7 +388,7 @@ def __call__(self, data: DataEntry) -> DataEntry:
             return data
         else:
             raise GluonTSDataError(
-                f"JSON object is missing a required field `{self.name}`"
+                f"Object is missing a required field `{self.name}`"
             )
 
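The last two hunks replace the silent `expand_dims` with a strict dimensionality check and a clearer error message. A self-contained sketch of the new behavior (the function name and dict layout are illustrative, not the gluonts API):

```python
import numpy as np

def process_field(data: dict, name: str, req_ndim: int) -> None:
    """Validate that a field already has the required number of dimensions;
    unlike the old code, never silently expand a 1-D array to 2-D."""
    value = np.asarray(data[name], dtype=np.float32)
    if value.ndim != req_ndim:
        raise ValueError(
            f"Array '{name}' has bad shape - expected "
            f"{req_ndim} dimensions, got {value.ndim}."
        )
    data[name] = value

entry = {"feat_dynamic_real": [[1.0, 2.0, 3.0]]}  # already 2-D: accepted
process_field(entry, "feat_dynamic_real", req_ndim=2)
assert entry["feat_dynamic_real"].shape == (1, 3)

bad = {"feat_dynamic_real": [1.0, 2.0, 3.0]}      # 1-D: now rejected
raised = False
try:
    process_field(bad, "feat_dynamic_real", req_ndim=2)
except ValueError as e:
    raised = True
    assert "expected 2 dimensions" in str(e)
assert raised
```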
24 changes: 0 additions & 24 deletions src/gluonts/dataset/constants.py

This file was deleted.
