Add path statistics #15

Merged
merged 4 commits on Dec 28, 2024
8 changes: 8 additions & 0 deletions Makefile
Original file line number Diff line number Diff line change
@@ -31,6 +31,14 @@ book: ## Build static jupyter {book}
poetry run jupyter-book build notebooks --all


.PHONY: nbconvert
nbconvert: ## Convert notebooks to myst markdown
poetry run ./dev/nbconvert

.PHONY: nbsync
nbsync: ## Sync python myst notebooks to .ipynb files - needed for VS Code notebook development
poetry run ./dev/nbsync

.PHONY: sphinx-config
sphinx-config: ## Build sphinx config
poetry run jupyter-book config sphinx notebooks
7 changes: 7 additions & 0 deletions dev/nbconvert
@@ -0,0 +1,7 @@
#!/usr/bin/env bash
set -e
shopt -s globstar  # without globstar, ** does not match nested directories

for file in notebooks/**/*.ipynb
do
  jupytext "$file" --sync
done
7 changes: 7 additions & 0 deletions dev/nbsync
@@ -0,0 +1,7 @@
#!/usr/bin/env bash
set -e
shopt -s globstar  # without globstar, ** does not match nested directories

for file in notebooks/**/*.md
do
  jupytext "$file" --sync
done
1 change: 1 addition & 0 deletions notebooks/_toc.yml
@@ -27,6 +27,7 @@ parts:
- file: applications/overview
sections:
- file: applications/volatility_surface
- file: applications/hurst
- file: applications/calibration

- file: examples/overview
36 changes: 18 additions & 18 deletions notebooks/applications/calibration.md
@@ -5,7 +5,7 @@ jupytext:
extension: .md
format_name: myst
format_version: 0.13
jupytext_version: 1.14.7
jupytext_version: 1.16.6
kernelspec:
display_name: Python 3 (ipykernel)
language: python
@@ -26,7 +26,7 @@ Early pointers
For calibration we use {cite:p}`ukf`.
Let's consider the Heston model as a test case

```{code-cell} ipython3
```{code-cell}
from quantflow.sp.heston import Heston

pr = Heston.create(vol=0.6, kappa=1.3, sigma=0.8, rho=-0.6)
@@ -82,11 +82,11 @@ the state equation is given by
X_{t+1} &= \left[\begin{matrix}\kappa\left(\theta\right) dt \\ 0\end{matrix}\right] +
\end{align}

```{code-cell} ipython3
```{code-cell}
[p for p in pr.variance_process.parameters]
```

```{code-cell} ipython3
```{code-cell}

```

@@ -106,7 +106,7 @@ x_t &= \left[\begin{matrix}\nu_t && w_t && z_t\end{matrix}\right]^T \\
\bar{x}_t = {\mathbb E}\left[x_t\right] &= \left[\begin{matrix}\nu_t && 0 && 0\end{matrix}\right]^T
\end{align}

```{code-cell} ipython3
```{code-cell}
from quantflow.data.fmp import FMP
frequency = "1min"
async with FMP() as cli:
@@ -115,13 +115,13 @@ df = df.sort_values("date").reset_index(drop=True)
df
```

```{code-cell} ipython3
```{code-cell}
import plotly.express as px
fig = px.line(df, x="date", y="close", markers=True)
fig.show()
```

```{code-cell} ipython3
```{code-cell}
import numpy as np
from quantflow.utils.volatility import parkinson_estimator, GarchEstimator
df["returns"] = np.log(df["close"]) - np.log(df["open"])
@@ -132,43 +132,43 @@ fig = px.line(ds["returns"], markers=True)
fig.show()
```

```{code-cell} ipython3
```{code-cell}
import plotly.express as px
from quantflow.utils.bins import pdf
df = pdf(ds["returns"], num=20)
fig = px.bar(df, x="x", y="f")
fig.show()
```

```{code-cell} ipython3
```{code-cell}
g1 = GarchEstimator.returns(ds["returns"], dt)
g2 = GarchEstimator.pk(ds["returns"], ds["pk"], dt)
```

```{code-cell} ipython3
```{code-cell}
import pandas as pd
yf = pd.DataFrame(dict(returns=g2.y2, pk=g2.p))
fig = px.line(yf, markers=True)
fig.show()
```

```{code-cell} ipython3
```{code-cell}
r1 = g1.fit()
r1
```

```{code-cell} ipython3
```{code-cell}
r2 = g2.fit()
r2
```

```{code-cell} ipython3
```{code-cell}
sig2 = pd.DataFrame(dict(returns=np.sqrt(g2.filter(r1["params"])), pk=np.sqrt(g2.filter(r2["params"]))))
fig = px.line(sig2, markers=False, title="Stochastic volatility")
fig.show()
```
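The variance filtering compared above can be illustrated with the standard GARCH(1,1) recursion. The sketch below is a hypothetical standalone helper (`garch_filter` is not the library's `GarchEstimator` API), assuming only numpy:

```python
import numpy as np

# Minimal GARCH(1,1) variance filter (hypothetical helper, not the library):
# sigma2[t+1] = omega + alpha * r[t]**2 + beta * sigma2[t]
def garch_filter(returns, omega, alpha, beta):
    sigma2 = np.empty(len(returns))
    sigma2[0] = np.var(returns)  # initialize at the sample variance
    for t in range(len(returns) - 1):
        sigma2[t + 1] = omega + alpha * returns[t] ** 2 + beta * sigma2[t]
    return sigma2

rng = np.random.default_rng(0)
r = 0.01 * rng.standard_normal(500)  # toy return series
vol = np.sqrt(garch_filter(r, omega=1e-6, alpha=0.05, beta=0.90))
```

With `alpha + beta < 1` the recursion is stationary and the filtered variance mean-reverts to `omega / (1 - alpha - beta)`.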

```{code-cell} ipython3
```{code-cell}
class HestonCalibration:

def __init__(self, dt: float, initial_std = 0.5):
@@ -186,19 +186,19 @@ class HestonCalibration:
return np.array(((1-self.kappa*self.dt, 0),(-0.5*self.dt, 0)))
```

```{code-cell} ipython3
```{code-cell}

```

```{code-cell} ipython3
```{code-cell}
c = HestonCalibration(dt)
c.x0
```

```{code-cell} ipython3
```{code-cell}
c.prediction(c.x0)
```

```{code-cell} ipython3
```{code-cell}
c.state_jacobian()
```
114 changes: 69 additions & 45 deletions notebooks/applications/hurst.md
@@ -5,67 +5,91 @@ jupytext:
format_name: myst
format_version: 0.13
jupytext_version: 1.16.6
kernelspec:
display_name: Python 3 (ipykernel)
language: python
name: python3
---

# Hurst Exponent

The [Hurst exponent](https://en.wikipedia.org/wiki/Hurst_exponent) is used as a measure of long-term memory of time series. It relates to the autocorrelations of the time series, and the rate at which these decrease as the lag between pairs of values increases.

It is a statistic that can be used to test whether a time series is mean-reverting or trending.
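A common way to estimate the Hurst exponent is from the scaling of the standard deviation of lagged differences, std[X(t+τ) − X(t)] ∝ τ^H. The following is a minimal sketch of that estimator, assuming only numpy (the `hurst_exponent` function is hypothetical, not part of the library):

```python
import numpy as np

def hurst_exponent(series, lags=range(2, 64)):
    """Estimate H as the slope of log std of lagged differences vs log lag."""
    lags = list(lags)
    tau = [np.std(series[lag:] - series[:-lag]) for lag in lags]
    slope, _ = np.polyfit(np.log(lags), np.log(tau), 1)
    return float(slope)

rng = np.random.default_rng(42)
bm = np.cumsum(rng.standard_normal(10_000))  # Brownian motion has H = 0.5
print(round(hurst_exponent(bm), 2))
```

For a trending (persistent) series the estimate is above 0.5, for a mean-reverting one it is below 0.5.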

```{code-cell} ipython3
from quantflow.sp.cir import CIR

p = CIR(kappa=1, sigma=1)
```

## Study the Wiener process OHLC

```{code-cell} ipython3
# %% [markdown]
# # Hurst Exponent
#
# The [Hurst exponent](https://en.wikipedia.org/wiki/Hurst_exponent) is used as a measure of long-term memory of time series. It relates to the autocorrelations of the time series, and the rate at which these decrease as the lag between pairs of values increases.
#
# It is a statistic that can be used to test whether a time series is mean-reverting or trending.

# %% [markdown]
# ## Study with the Wiener Process
#
# We want to construct a mechanism to estimate the Hurst exponent via OHLC data, because such data is widely available from data providers and is easily constructed as an online signal during trading.
#
# In order to evaluate results against known solutions, we consider the Wiener process as the generator of time series.
#
# The Wiener process is a continuous-time stochastic process named in honor of Norbert Wiener. It is often also called Brownian motion due to its historical connection with the physical model of Brownian motion of particles in water, named after the botanist Robert Brown.

# %%
from quantflow.sp.weiner import WeinerProcess
from quantflow.utils.dates import start_of_day
p = WeinerProcess(sigma=0.5)
paths = p.sample(1, 1, 1000)
df = paths.as_datetime_df().reset_index()
paths = p.sample(1, 1, 24*60*60)
paths.plot()

# %%
df = paths.as_datetime_df(start=start_of_day()).reset_index()
df
```
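The **scaled** realized standard deviation used below works because Wiener increments have standard deviation σ√dt, so dividing the realized std of the increments by √dt recovers σ regardless of the sampling frequency. A hypothetical numpy-only sketch, independent of the library:

```python
import numpy as np

# Standalone check: the realized std of Wiener increments, divided by
# sqrt(dt), recovers sigma independently of the step size dt.
rng = np.random.default_rng(7)
sigma = 0.5
estimates = {}
for n in (1_000, 100_000):
    dt = 1.0 / n
    increments = sigma * np.sqrt(dt) * rng.standard_normal(n)
    estimates[n] = float(np.std(increments) / np.sqrt(dt))
print(estimates)  # both entries close to sigma = 0.5
```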

```{code-cell} ipython3
# %% [markdown]
# ### Realized Variance
#
# At this point we estimate the standard deviation using the **realized variance** along the path. We use the **scaled** flag so that the standard deviation is scaled by the square root of the time step, which removes the dependency on the time-step size.
# The value should be close to the **sigma** of the WeinerProcess defined above.

# %%
float(paths.path_std(scaled=True)[0])

# %% [markdown]
# ### Range-based variance estimators
#
# We now turn our attention to range-based volatility estimators. These estimators depend on OHLC time series, which are widely available from data providers such as [FMP](https://site.financialmodelingprep.com/).
# To analyze range-based variance estimators, we use the **quantflow.ta.OHLC** tool, which allows one to down-sample a time series to an OHLC series and estimate its variance with three different estimators:
#
# * **Parkinson** (1980)
# * **Garman & Klass** (1980)
# * **Rogers & Satchell** (1991)
#
# See {cite:p}`molnar` for a detailed overview of the properties of range-based estimators.
#
# For this we build an OHLC estimator as a template and use it to create OHLC estimators for different periods.

# %%
import pandas as pd
from quantflow.ta.ohlc import OHLC
from datetime import timedelta
ohlc = OHLC(serie="0", period="10m", rogers_satchell_variance=True, parkinson_variance=True, garman_klass_variance=True)
result = ohlc(df)
result
```
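The first of the three estimators can be sketched in a few lines. This is a hypothetical standalone version of the Parkinson (1980) formula, assuming only numpy (not the library's implementation): the per-bar variance is the mean of ln(high/low)² divided by 4 ln 2.

```python
import numpy as np

# Hypothetical standalone Parkinson (1980) range-based variance estimator:
# var = mean( ln(high / low)^2 ) / (4 * ln 2)
def parkinson_variance(high, low):
    high, low = np.asarray(high, float), np.asarray(low, float)
    return float(np.mean(np.log(high / low) ** 2) / (4.0 * np.log(2.0)))

# Toy bars with a constant high/low ratio, so the result has a closed form.
high = np.array([101.0, 102.0, 103.0])
low = high / 1.01
print(parkinson_variance(high, low))  # equals log(1.01)**2 / (4 * log(2))
```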

```{code-cell} ipython3

```

# Links

* [Wikipedia](https://en.wikipedia.org/wiki/Hurst_exponent)
* [Hurst Exponent for Algorithmic Trading
](https://robotwealth.com/demystifying-the-hurst-exponent-part-1/)

```{code-cell} ipython3
results = []
for period in ("2m", "5m", "10m", "30m", "1h", "4h"):
operator = ohlc.model_copy(update=dict(period=period))
result = operator(df).sum()
results.append(dict(period=period, pk=result["0_pk"].item(), gk=result["0_gk"].item(), rs=result["0_rs"].item()))
vdf = pd.DataFrame(results)
vdf

# %% [markdown]
# # Links
#
# * [Wikipedia](https://en.wikipedia.org/wiki/Hurst_exponent)
# * [Hurst Exponent for Algorithmic Trading
# ](https://robotwealth.com/demystifying-the-hurst-exponent-part-1/)

# %%
import pandas as pd
v = pd.to_timedelta(0.02, unit="d")
v
```

```{code-cell} ipython3
# %%
v.to_pytimedelta()
```

```{code-cell} ipython3
# %%
from quantflow.utils.dates import utcnow
pd.date_range(start=utcnow(), periods=10, freq="0.5S")
```

```{code-cell} ipython3
# %%
7*7+3*3

```
# %%
4 changes: 2 additions & 2 deletions notebooks/applications/overview.md
@@ -4,7 +4,7 @@ jupytext:
extension: .md
format_name: myst
format_version: 0.13
jupytext_version: 1.14.7
jupytext_version: 1.16.6
kernelspec:
display_name: Python 3 (ipykernel)
language: python
@@ -18,6 +18,6 @@ Real-world applications of the library
```{tableofcontents}
```

```{code-cell} ipython3
```{code-cell}

```
10 changes: 5 additions & 5 deletions notebooks/applications/sampling.md
@@ -4,7 +4,7 @@ jupytext:
extension: .md
format_name: myst
format_version: 0.13
jupytext_version: 1.14.7
jupytext_version: 1.16.6
kernelspec:
display_name: Python 3 (ipykernel)
language: python
@@ -15,21 +15,21 @@ kernelspec:

The library uses the `Paths` class for managing Monte Carlo paths.

```{code-cell} ipython3
```{code-cell}
from quantflow.utils.paths import Paths

nv = Paths.normal_draws(paths=1000, time_horizon=1, time_steps=1000)
```

```{code-cell} ipython3
```{code-cell}
nv.var().mean()
```

```{code-cell} ipython3
```{code-cell}
nv = Paths.normal_draws(paths=1000, time_horizon=1, time_steps=1000, antithetic_variates=False)
nv.var().mean()
```
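The effect of the `antithetic_variates` flag compared above can be illustrated independently of the library: pairing each normal draw z with −z preserves the N(0, 1) marginal while negatively correlating the pair, which typically lowers the Monte Carlo error for monotone functionals. A hypothetical numpy-only sketch estimating E[exp(Z)] = e^{1/2}:

```python
import numpy as np

# Hypothetical sketch (not the library API): antithetic variates pair each
# draw z with -z, keeping the N(0, 1) marginal while negatively correlating
# the pair, which reduces variance for monotone functionals such as exp.
rng = np.random.default_rng(0)
n = 100_000
true = float(np.exp(0.5))                # E[exp(Z)] for Z ~ N(0, 1)

plain = rng.standard_normal(n)
est_plain = float(np.exp(plain).mean())  # crude Monte Carlo estimate

z = rng.standard_normal(n // 2)          # half as many independent draws
est_anti = float((0.5 * (np.exp(z) + np.exp(-z))).mean())
print(est_plain, est_anti, true)
```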

```{code-cell} ipython3
```{code-cell}

```