branch: models_igr: graph display + record selection #6
base: master
Conversation
…red); marking of true/predicted classes (record.py), randomization of starting weights (ebner_agent.py)
…utput in record.py"
neuronpp/utils/record.py
Outdated
# update data
line.set_data(t, r)

# info draw triangles for true and predicted classes
This should be parametrized from the top, i.e. the plot() function, so that this drawing can be switched on or off (as optional).
I shall add an option.
neuronpp/utils/record.py
Outdated
# update data
line.set_data(t, r)

# info draw triangles for true and predicted classes
true_x, true_y, pred_x, pred_y = self._class_tcks(label=i, true_class=true_class, pred_class=pred_class,
i is not a label. It is an enumerator of sections recorded with the same variable name (e.g. v).
Yes, this is an error here, or rather it is used under the assumption that labels are 0, 1, 2, ... They need not be. I shall pass a table of class_names as a parameter, consistent with the enumeration, and then in line 177 we shall have
if true_class[k] == class_names[label], and similarly in line 179: if pred_class == true_classes[label]
i represents the number of the segment saved to be recorded. If you want to record from 5 neurons - 2 input, 3 output - in the same Record object, you can. So in this case i won't represent any true values, but rather an enumerator.
I think you should add a parameter to the Record object for showing predicted/true values, but it must be done explicitly as an optional param, in the constructor or in the plotting method, because otherwise it is extremely error prone.
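A minimal sketch of the class_names idea discussed above; the function and parameter names here (class_names, true_class, pred_class) are illustrative and not part of the actual Record API:

# Hypothetical sketch: map the recording enumerator `i` to an explicit class name,
# instead of assuming that labels are 0, 1, 2, ...
def _marks_for_recording(i, class_names, true_class, pred_class):
    """Return which window indices should get a 'true' or 'predicted' mark for the
    i-th recorded section, given class_names consistent with the enumeration."""
    label = class_names[i]  # e.g. class_names = ["left", "right", "none"]
    true_idx = [k for k, c in enumerate(true_class) if c == label]
    pred_idx = [k for k, c in enumerate(pred_class) if c == label]
    return true_idx, pred_idx

# usage with illustrative data
true_idx, pred_idx = _marks_for_recording(
    i=0,
    class_names=["left", "right"],
    true_class=["left", "right", "left"],
    pred_class=["right", "right", "left"],
)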
neuronpp/utils/record.py
Outdated
fig.canvas.draw()
fig.canvas.flush_events()

if create_fig:
    plt.show(block=False)

def _class_tcks(self, label, true_class, pred_class, t, r, stepsize, dt):
Write the full name of the function.
ok, how about
- def _class_tcks(self, label, true_class, pred_class, t, r, stepsize, dt):
+ def _draw_true_predicted_class_marks(self, label, true_class, pred_class, t, r, stepsize, dt):
it works for me
neuronpp/utils/record.py
Outdated
ax.set_title("Variable: %s" % var_name)
ax.plot(self.t, rec, label=name)
ax.set(xlabel='t (ms)', ylabel=var_name)
ax.legend()

- def _plot_animate(self, steps=10000, y_lim=None, position=None):
+ def _plot_animate(self, steps=10000, y_lim=None, position=None, true_class=None, pred_class=None, stepsize=None,
+                   dt=None):
If you add new params to a method that has a docstring, please add docstring entries for them as well.
ok
if 'inh' in c.name:
    self.colors.append('red')
    y_pos -= 5
# todo center vertically by half width of the hid layer
y_pos -= 6
This was the original idea of how to present the plot, but now I think it was a bad idea to manually shift it like this. But it may stay as it is.
The problem in line 95 with the actual shift is finding the name of the preceding layer. If it is 'hid_1', then it could be done via
- y_pos -= 6
+ y_pos -= self.population_sizes['hid_1'] // 2 + self.population_sizes[pop_name]
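If the preceding layer's name should not be hard-coded, one possible sketch is to look it up from insertion order; this assumes population_sizes is an ordered dict {population_name: size} filled in layer order, which is an assumption about this code, not a confirmed detail:

# Sketch: find the population that precedes pop_name (Python 3.7+ dicts keep insertion order).
def previous_population_name(population_sizes, pop_name):
    names = list(population_sizes.keys())
    idx = names.index(pop_name)
    return names[idx - 1] if idx > 0 else None

# usage with illustrative sizes
sizes = {"inp": 4, "hid_1": 6, "out": 2}
prev = previous_population_name(sizes, "out")   # -> "hid_1"
y_pos = 0
if prev is not None:
    y_pos -= sizes[prev] // 2 + sizes["out"]    # analogous to the suggested shift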
neuronpp/utils/record.py
Outdated
true_y = [min(r) + 2] * len(true_x)
pred_y = [max(r) - 2] * len(pred_x)
Need to find a better true_y (position of the lower triangle) and pred_y (position of the upper one). All goes well as long as the motor neurons are activated.
…class labels parameter
# Conflicts: # agents/agent.py # agents/ebner_agent.py
neuronpp/utils/record.py
Outdated
if true_labels is not None:
    true_x, pred_x = self._true_predicted_class_marks(label=true_labels[i], true_class=true_class,
                                                      pred_class=pred_class, t=t, r=r,
                                                      stepsize=stepsize, dt=dt)
There is no r param in the method self._true_predicted_class_marks()
neuronpp/utils/record.py
Outdated
raise ValueError("True_labels parameter need to be given if show_true_prediction is True")
if y_lim is None:
    true_y = [y_limits[0] + np.abs(y_limits[0]) * 0.09] * len(true_x)
    pred_y = [y_limits[1] - np.abs(y_limits[1] * 0.11)] * len(pred_x)
The variables y_lim and y_limits are too confusing. It should be a single variable. I think you should override y_lim in line 154.
Actually no, because the parameter y_lim is global for all plots (a tuple in the parameters), while y_limits is computed per individual plot. During the course of the simulation, the activations tend to have different values.
Changed the name y_limits to this_plot_y_limits.
Actually, this will never work as it was before and is now: if the max and min values come too close together, the plot's y_lim is much different from what we can read from the data. There must be a method which first computes the wanted limits, then sets them, then reads what matplotlib really used. Is there one?
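A minimal sketch of that compute/set/read-back cycle in matplotlib, assuming the Axes object and the recorded values r are available under these illustrative names; Axes.set_ylim and Axes.get_ylim are the relevant calls, and get_ylim returns the limits matplotlib actually applied:

import numpy as np
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
line, = ax.plot([], [])

def apply_y_limits(ax, r, pad=0.05):
    """Compute the wanted y-limits from the data, set them, then read back what
    matplotlib actually uses (matplotlib itself would expand a degenerate
    bottom == top range, so we guard against it here)."""
    lo, hi = float(np.min(r)), float(np.max(r))
    span = hi - lo
    if span == 0:                      # guard against max == min
        span = max(abs(hi), 1.0)
    ax.set_ylim(lo - pad * span, hi + pad * span)
    return ax.get_ylim()               # limits matplotlib really applied

# usage with illustrative data
t = np.linspace(0, 10, 200)
r = np.sin(t)
line.set_data(t, r)
this_plot_y_limits = apply_y_limits(ax, r)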
:param pred_class: list of predicted class labels in window
:param stepsize: agent readout time step
:param dt: agent integration time step
:param show_true_predicted: whether to print true/predicted class' marks on the plot
I think there is no point in using the show_true_predicted param, because you can just check whether true_class and pred_class are not None.
This is a matter of taste. Lots of code is written so that some behaviour is switched on/off using a single switch.
But this is an important comment: I shall move the binary show_true_predicted out of the run_params tuple, all the more so as it is not actually a simulation parameter.
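A small sketch of how the two positions could be reconciled: keep the explicit switch but default it from the None checks. The signature below is only illustrative, not the actual _plot_animate() signature, and the class is a stand-in for neuronpp's Record:

class Record:  # stand-in; only the relevant dispatch logic is sketched
    def _plot_animate(self, steps=10000, y_lim=None, position=None,
                      true_class=None, pred_class=None, stepsize=None, dt=None,
                      show_true_predicted=None):
        # Explicit switch wins; otherwise draw marks only if both lists were given.
        if show_true_predicted is None:
            show_true_predicted = true_class is not None and pred_class is not None
        return show_true_predicted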
neuronpp/utils/record.py
Outdated
# update data
line.set_data(t, r)

if show_true_predicted:
    # info draw triangles for true and predicted classes
This whole part should be moved to a separate method; the method _plot_animate() is already too big (see the sketch after this exchange).
The owner is my god
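A rough sketch of what that extraction could look like; the helper name _draw_true_predicted_marks, the ax parameter, and the mark positions are illustrative and not taken from the PR itself:

# The call site inside _plot_animate() keeps only the dispatch:
#     line.set_data(t, r)
#     if show_true_predicted:
#         self._draw_true_predicted_marks(ax, i, true_labels, true_class, pred_class, t, stepsize, dt)
#
# The drawing itself moves into a method of Record (sketch only):
def _draw_true_predicted_marks(self, ax, i, true_labels, true_class, pred_class, t, stepsize, dt):
    # Positions come from the PR's helper; note it takes no r param.
    true_x, pred_x = self._true_predicted_class_marks(label=true_labels[i], true_class=true_class,
                                                      pred_class=pred_class, t=t,
                                                      stepsize=stepsize, dt=dt)
    lo, hi = ax.get_ylim()
    ax.plot(true_x, [lo + 0.09 * abs(lo)] * len(true_x), "^")  # true-class triangles
    ax.plot(pred_x, [hi - 0.11 * abs(hi)] * len(pred_x), "v")  # predicted-class triangles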
@@ -1,11 +1,9 @@
from nrn import Segment, Section
I think this concept of recording with labels should be moved to a separate class which inherits from Record, e.g. RecordWithLabels. In this case the dt and true_labels params from the _plot_animate() method can be moved into the constructor.
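A minimal sketch of that subclass, assuming Record is importable from neuronpp.utils.record; the unknown Record constructor signature is deliberately forwarded via *args/**kwargs rather than guessed:

from neuronpp.utils.record import Record


class RecordWithLabels(Record):
    """Record variant that knows its class labels, so dt and true_labels
    no longer have to be threaded through _plot_animate()."""

    def __init__(self, *args, true_labels=None, dt=None, **kwargs):
        # Forward everything else to Record unchanged (its signature is not assumed here).
        super().__init__(*args, **kwargs)
        self.true_labels = true_labels
        self.dt = dt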
# Conflicts: # neuronpp/utils/record.py
# Conflicts: # neuronpp/utils/record.py
CORRECTED all comments, still some very minor problems arise with single edges drawn incorrectly
UPDATES to record.py