diff --git a/_config.yml b/_config.yml
index c367fef..db9d8d5 100644
--- a/_config.yml
+++ b/_config.yml
@@ -2,10 +2,10 @@ title: Metafacture Tutorial
description: This is a tutorial for Metafacture.
theme: just-the-docs
-url: https://metafacture.github.io/metafacture-documentation
+url: https://metafacture.github.io/metafacture-tutorial
aux_links:
- Metafacture Documentation on Github: https://github.com/metafacture/metafacture-tutorial
+ Metafacture Tutorial on Github: https://github.com/metafacture/metafacture-tutorial
# External navigation links
nav_external_links:
diff --git a/docs/02_Introduction_into_Metafacture-Flux.md b/docs/02_Introduction_into_Metafacture-Flux.md
index 552c470..99fcd09 100644
--- a/docs/02_Introduction_into_Metafacture-Flux.md
+++ b/docs/02_Introduction_into_Metafacture-Flux.md
@@ -38,7 +38,7 @@ See the result below? It is `Hello, friend. I'am Metafacture!`.
But what have we done here?
We have a short text string `"Hello, friend. I'am Metafacture"`. That is printed with the module `print`.
-A Metafacture Workflow is nothing else than an incoming text string that is manipulated by one or multiple moduls that do something with the incoming string.
+A Metafacture Workflow is nothing else than an incoming text string that is manipulated by one or multiple modules that do something with the incoming string.
However, the workflow does not have to start with a text string but can also be a variable that stands for the text string and needs to be defined before the workflow. As this:
```text
@@ -93,8 +93,7 @@ inputFile
```
The inputFile is opened as a file (`open-file`) and then processed line by line (`as-lines`).
-You can see that in this [sample](https://metafacture.org/playground/?flux=inputFile%0A%7Copen-file%0A%7Cas-lines%0A%7Cprint%0A%3B&data=Hello%2C+friend.+I%27am+Metafacture%21).
-https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+as-lines%0A%7C+print%0A%3B&data=Hello%2C+friend.+I%27am+Metafacture%21
+Have a look at this [sample](https://metafacture.org/playground/?flux=inputFile%0A%7Copen-file%0A%7Cas-lines%0A%7Cprint%0A%3B&data=Hello%2C+friend.+I%27am+Metafacture%21).
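+
+Decoded, the linked workflow is simply:
+
+```text
+inputFile
+|open-file
+|as-lines
+|print
+;
+```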
We usually do not start with any random text strings but with data. So let's play around with some data.
@@ -108,7 +107,7 @@ You will see data that look like this:
This is data in JSON format. But it seems not very readable.
-But all these fields tell something about a publication, a book, with 268 pages and title Ordinary Vices by Judith N. Shklar.
+All these fields tell us something about a publication, a book, with 268 pages and title "Ordinary Vices" by Judith N. Shklar.
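+A few of these fields, rendered as YAML for readability (a hand-written illustration, not yet a Metafacture output):
+
+```yaml
+title: Ordinary vices
+number_of_pages: 268
+by_statement: Judith N. Shklar.
+```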
Let's copy the JSON data into our `ìnputFile-content` field. [And run it again](https://metafacture.org/playground/?flux=inputFile%0A%7Copen-file%0A%7Cas-lines%0A%7Cprint%0A%3B&data=%7B%22publishers%22%3A+%5B%22Belknap+Press+of+Harvard+University+Press%22%5D%2C+%22identifiers%22%3A+%7B%22librarything%22%3A+%5B%22321843%22%5D%2C+%22goodreads%22%3A+%5B%222439014%22%5D%7D%2C+%22covers%22%3A+%5B413726%5D%2C+%22local_id%22%3A+%5B%22urn%3Atrent%3A0116301499939%22%2C+%22urn%3Asfpl%3A31223009984353%22%2C+%22urn%3Asfpl%3A31223011345064%22%2C+%22urn%3Acst%3A10017055762%22%5D%2C+%22lc_classifications%22%3A+%5B%22JA79+.S44+1984%22%2C+%22HM216+.S44%22%2C+%22JA79.S44+1984%22%5D%2C+%22key%22%3A+%22/books/OL2838758M%22%2C+%22authors%22%3A+%5B%7B%22key%22%3A+%22/authors/OL381196A%22%7D%5D%2C+%22ocaid%22%3A+%22ordinaryvices0000shkl%22%2C+%22publish_places%22%3A+%5B%22Cambridge%2C+Mass%22%5D%2C+%22subjects%22%3A+%5B%22Political+ethics.%22%2C+%22Liberalism.%22%2C+%22Vices.%22%5D%2C+%22pagination%22%3A+%22268+p.+%3B%22%2C+%22source_records%22%3A+%5B%22marc%3AOpenLibraries-Trent-MARCs/tier5.mrc%3A4020092%3A744%22%2C+%22marc%3Amarc_openlibraries_sanfranciscopubliclibrary/sfpl_chq_2018_12_24_run01.mrc%3A195791766%3A1651%22%2C+%22ia%3Aordinaryvices0000shkl%22%2C+%22marc%3Amarc_claremont_school_theology/CSTMARC1_barcode.mrc%3A137174387%3A3955%22%2C+%22bwb%3A9780674641754%22%2C+%22marc%3Amarc_loc_2016/BooksAll.2016.part15.utf8%3A115755952%3A680%22%2C+%22marc%3Amarc_claremont_school_theology/CSTMARC1_multibarcode.mrc%3A137367696%3A3955%22%2C+%22ia%3Aordinaryvices0000shkl_a5g0%22%2C+%22marc%3Amarc_columbia/Columbia-extract-20221130-001.mrc%3A328870555%3A1311%22%2C+%22marc%3Aharvard_bibliographic_metadata/ab.bib.01.20150123.full.mrc%3A156768969%3A815%22%5D%2C+%22title%22%3A+%22Ordinary+vices%22%2C+%22dewey_decimal_class%22%3A+%5B%22172%22%5D%2C+%22notes%22%3A+%7B%22type%22%3A+%22/type/text%22%2C+%22value%22%3A+%22Bibliography%3A+p.+251-260.\nIncludes+index.%22%7D%2C+%22number_of_pages%22%3A+268%
2C+%22languages%22%3A+%5B%7B%22key%22%3A+%22/languages/eng%22%7D%5D%2C+%22lccn%22%3A+%5B%2284000531%22%5D%2C+%22isbn_10%22%3A+%5B%220674641752%22%5D%2C+%22publish_date%22%3A+%221984%22%2C+%22publish_country%22%3A+%22mau%22%2C+%22by_statement%22%3A+%22Judith+N.+Shklar.%22%2C+%22works%22%3A+%5B%7B%22key%22%3A+%22/works/OL2617047W%22%7D%5D%2C+%22type%22%3A+%7B%22key%22%3A+%22/type/edition%22%7D%2C+%22oclc_numbers%22%3A+%5B%2210348450%22%5D%2C+%22latest_revision%22%3A+16%2C+%22revision%22%3A+16%2C+%22created%22%3A+%7B%22type%22%3A+%22/type/datetime%22%2C+%22value%22%3A+%222008-04-01T03%3A28%3A50.625462%22%7D%2C+%22last_modified%22%3A+%7B%22type%22%3A+%22/type/datetime%22%2C+%22value%22%3A+%222024-12-27T16%3A46%3A50.181109%22%7D%7D).
@@ -117,14 +116,12 @@ The output in result is the same as the input and it is still not very readable.
Let's turn the one line of JSON data into YAML. YAML is another format for structured information which is a bit easier to read for human eyes.
In order to change the serialization of the data we need to decode the data and then encode the data.
-Metafacture has lots of decoder and encoder modules for specific data formats that can be used in an Flux workflow.
+Metafacture has lots of decoder and encoder modules for specific data formats that can be used in a Flux workflow.
Let's try this out. Add the modules `decode-json` and `encode-yaml` to your Flux workflow.
The Flux should now look like this:
-Flux:
-
```text
inputFile
| open-file
@@ -217,7 +214,7 @@ Luckily, we cannot only open the data we have in our `inputFile-content` field,
Clear your playground and copy the following Flux workflow:
-```
+```text
"https://openlibrary.org/books/OL2838758M.json"
| open-http
| as-lines
@@ -227,22 +224,24 @@ Clear your playground and copy the following Flux workflow:
;
```
-The [result in the playground](https://metafacture.org/playground/?flux=%22https%3A//openlibrary.org/books/OL2838758M.json%22%0A%7C+open-http%0A%7C+as-lines%0A%7C+decode-json%0A%7C+encode-yaml%0A%7C+print%0A%3B) should be the same as before without having to paste anything into the text field. We just used the module `open-http` and directly retrieved the data from the URL.
+The [result in the playground](https://metafacture.org/playground/?flux=%22https%3A//openlibrary.org/books/OL2838758M.json%22%0A%7C+open-http%0A%7C+as-lines%0A%7C+decode-json%0A%7C+encode-yaml%0A%7C+print%0A%3B) should be the same as before without having to paste anything into the text field. We just used the module `open-http` to directly retrieve the data from the URL.
-Let's take a look what a Flux workflow does. The Flux workflow is combination of different moduls to process incoming structured data. In our example we have different things that we do with these modules:
+Let's take a look at what a Flux workflow does. The Flux workflow is a combination of different modules to process incoming structured data. In our example we have different things that we do with these modules:
1. We have a URL as input. The URL localizes the data on the web.
-2. We tell Metafacture to request the stated url using `open-http`.
+2. We tell Metafacture to request the stated URL using `open-http`.
3. Then we define how to handle the incoming data: since the JSON is written in one line, we tell Metafacture to regard every new line as a new record with `as-lines`.
-4. Afterwards we tell Metafacture to `decode-json` in order to translate the incoming data as json to the generic internal data model that is called metadata events
+4. Afterwards we tell Metafacture to `decode-json` in order to translate the incoming JSON data to the generic internal data model that is called metadata events.
5. Then we instruct Metafacture to serialize the metadata events as YAML with `encode-yaml`.
6. Finally, we tell Metafacture to `print` everything.
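+
+The six steps map line by line onto the workflow above (step numbers added as `//` comments, which Flux supports):
+
+```text
+"https://openlibrary.org/books/OL2838758M.json"   // 1. the URL as input
+| open-http    // 2. request the URL
+| as-lines     // 3. treat every line as a record
+| decode-json  // 4. JSON to metadata events
+| encode-yaml  // 5. metadata events to YAML
+| print        // 6. print everything
+;
+```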
-So let's have a small recap of what we done and learned so far: * We played around with the Metafacture Playground.
-* We learned that a Metafacture Flux workflow is a combination of modules with an inital text string or an variable.
+So let's have a small recap of what we've done and learned so far:
+
+* We've played around with the Metafacture Playground.
+* We've learned that a Metafacture Flux workflow is a combination of modules with an initial text string or a variable.
* We got to know different modules like `open-http`, `as-lines`, `decode-json`, `encode-yaml`, `print`.
-More modules can be found in the [documentation of available flux commands](https://github.com/metafacture/metafacture-documentation/blob/master/flux-commands.html).
+More modules can be found in the [documentation of available flux commands](https://metafacture.github.io/metafacture-documentation/docs/flux/flux-commands.html).
Now take some time and play around a little bit more and use some other modules.
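+
+For instance, you could keep the same workflow but swap the encoder (a sketch; `encode-json` and its `prettyPrinting` option show up later in this tutorial):
+
+```text
+"https://openlibrary.org/books/OL2838758M.json"
+| open-http
+| as-lines
+| decode-json
+| encode-json(prettyPrinting="true")
+| print
+;
+```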
@@ -268,16 +267,16 @@ Now take some time and play around a little bit more and use some other modules.
What you see with the modules `encode-formeta` and `write` is that modules can have further specification in brackets.
These can either be a string in `"..."` or attributes that define options, as with `style=`.
-One last thing you should learn on an abstract level is to grasp the general idea of Metafacture Flux workflows is that they have many different moduls through which the data is flowing.
-The most abstract and most common process resemble the following steps:
+One last thing to learn on an abstract level, to grasp the general idea of Metafacture Flux workflows: they consist of many different modules through which the data flows.
+The most abstract and most common process resembles the following steps:
**→ read → decode → transform → encode → write →**
-This process is one that transforms incoming data in a way that is changed at the end.
+This process chain transforms incoming data in distinct steps.
Each step can be done by one or a combination of multiple modules.
Modules are small tools that do parts of the complete task we want to do.
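+
+Mapped onto modules from this lesson, the abstract chain could look like this (a sketch; the transform step is left out here, since transformations are the topic of the next lesson):
+
+```text
+inputFile       // read
+| open-file     // read
+| as-lines      // read: split into records
+| decode-json   // decode
+| encode-yaml   // encode
+| print         // write
+;
+```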
-Each modul demands a certain input and give a certain output. This is called signature.
+Each module demands a certain input and gives a certain output. This is called its signature.
e.g.:
The first module `open-file` expects a string and provides read data (called Reader).
@@ -286,12 +285,12 @@ This reader data can be passed on to a modul that accepts reader data e.g. in ou
If you have a look at the Flux module/command documentation, you will see under "signature" which data a module expects and which data it outputs.
-The combination of moduls is a Flux workflow.
+The combination of modules is called a "Flux workflow".
Each module is separated by a `|` and every workflow ends with a `;`.
Comments can be added with `//`.
-See:
+For example:
```
//input string:
@@ -319,7 +318,7 @@ Add the option: prettyPrinting="true" to the encode-json
Answer
@@ -329,7 +328,7 @@ The signature of `decode-xml` and `decode-json` is quite different:
`decode-json`: signature: String -> StreamReceiver
Explanation:
-`decode-xml` expects data from Reader output of `open-file` or `open-http`, and creates output that can be transformed by a specific xml handler. The xml parser of `decode-xml` works straight with read content of a file or a url.
+`decode-xml` expects data from the Reader output of `open-file` or `open-http`, and creates output that can be transformed by a specific XML handler. The XML parser of `decode-xml` works directly with the content read from a file or a URL.
`decode-json` expects data from the string output of modules like `as-lines` or `as-records` and creates output that can be transformed by `fix` or encoded with a module like `encode-xml`. For most decoding you have to specify how (`as-lines` or `as-records`) the incoming data is read.
@@ -354,7 +353,7 @@ Explanation:
As you surely already saw, I mentioned transform as one step in a Metafacture workflow.
-But aside from changing the serialisation we did not play around with transformations yet.
+But aside from changing the serialization we did not play around with transformations yet.
This will be the theme of the next session.
---------------
diff --git a/docs/03_Introduction_into_Metafacture-Fix.md b/docs/03_Introduction_into_Metafacture-Fix.md
index 5083437..96da4d7 100644
--- a/docs/03_Introduction_into_Metafacture-Fix.md
+++ b/docs/03_Introduction_into_Metafacture-Fix.md
@@ -7,15 +7,15 @@ parent: Tutorial
# Lesson 3: Introduction into Metafacture Fix
-In the last session we learned about Flux moduls.
-Flux moduls can do a lot of things. They configure the "high-level" transformation pipeline.
+In the last session we learned about Flux modules.
+Flux modules can do a lot of things. They configure the "high-level" transformation pipeline.
-But the main transformation of incoming data at record, elemenet and value level is usually done by the transformation moduls [Fix](https://metafacture.github.io/metafacture-documentation/docs/flux/flux-commands.html#fix) or [Morph](https://metafacture.github.io/metafacture-documentation/docs/flux/flux-commands.html#morph) as one step in the pipeline.
+But the main transformation of incoming data at record, element and value level is usually done by the transformation modules [Fix](https://metafacture.github.io/metafacture-documentation/docs/flux/flux-commands.html#fix) or [Morph](https://metafacture.github.io/metafacture-documentation/docs/flux/flux-commands.html#morph) as one step in the pipeline.
By transformation we mean things like:
* Manipulating element names and element values
-* Change hierachies and structures of records
+* Changing hierarchies and structures of records
* Looking up values in a concordance list
But not changing the serialization; that is part of encoding and decoding.
@@ -47,10 +47,10 @@ You should end up with something like:
title: "Ordinary vices"
```
-The Fix module, called by `fix`, in Metafacture is used to manipulate the input data filtering fields we would like to see. Only one fix-function was used: `retain`, which throws away all the data from the input except the stated `"title"` field. Normally all incoming data is passed through, unless it is somehow manipulated or a `retain` function is used.
+The Fix module, called by `fix`, is used to manipulate the input data, filtering the fields we would like to see. Only one Fix function was used: `retain`, which throws away all the data from the input except the stated `"title"` field. Normally all incoming data is passed through unless it is somehow manipulated or a `retain` function is used.
-HINT: As long as you embedd the fix functions in the Flux Workflow, you have to use double quotes to fence the fix functions,
-and single quotes in the fix functions. As we did here: `fix ("retain('title')")`
+HINT: As long as you embed the Fix functions in the Flux workflow, you have to use double quotes to fence the Fix functions
+and single quotes inside the Fix functions, as we did here: `fix ("retain('title')")`
Now let us additionally keep the info that is given in the element `"publish_date"` and the subfield `"key"` in `'type'` by adding `'publish_date', 'type.key'` to `retain`:
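+
+Following the quoting hint above, the extended call would read:
+
+```text
+fix ("retain('title', 'publish_date', 'type.key')")
+```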
@@ -76,9 +76,9 @@ notes:
```
-When manipulating data you often need to create many fixes to process a data file in the format and structure you need. With a text editor you can write all fix functions in a singe separate Fix file.
+When manipulating data you often need to create many Fixes to process a data file in the format and structure you need. With a text editor you can write all Fix functions in a single separate Fix file.
-The playground has an transformationFile-content area that can be used as if the Fix is in a separate file.
+The playground has a transformationFile-content area that can be used as if the Fix is in a separate file.
In the playground we use the variable `transformationFile` to address the Fix file.
Like this:
@@ -93,16 +93,16 @@ retain("title", "publish_date", "notes.value", "type.key")
Using a separate Fix file is recommended if you need to write many Fix functions. It will keep the Flux workflow clear and legible.
-To add more fixes we can again edit the Fix file.
+To add more Fixes we can again edit the Fix file.
Let's add this line in front of the `retain` function:
-```
+```perl
move_field("type.key", "pub_type")
```
Also change the `retain` function so that you keep the new element `"pub_type"` instead of the nested `"key"` element, which no longer exists.
-```
+```perl
move_field("type.key","pub_type")
retain("title", "publish_date", "notes.value", "pub_type")
```
@@ -121,7 +121,7 @@ notes:
With `move_field` we moved and renamed an existing element.
As the next step, add the following function before the `retain` function.
-```
+```perl
replace_all("pub_type","/type/","")
```
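+
+Given the `pub_type` value from our record, the before/after would be (shown as comments):
+
+```perl
+# before: pub_type: "/type/edition"
+replace_all("pub_type", "/type/", "")
+# after:  pub_type: "edition"
+```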
@@ -169,7 +169,7 @@ retain("title", "publish_date", "pub_type")
2) [Add a field with todays date called `"map_date"`.](https://metafacture.org/playground/?flux=%22https%3A//openlibrary.org/books/OL2838758M.json%22%0A%7C+open-http%0A%7C+as-lines%0A%7C+decode-json%0A%7C+fix+%28transformationFile%29%0A%7C+encode-yaml%0A%7C+print%0A%3B&transformation=move_field%28%22type.key%22%2C%22pub_type%22%29%0Areplace_all%28%22pub_type%22%2C%22/type/%22%2C%22%22%29%0A...%28%22mape_date%22%2C%22...%22%29%0Aretain%28%22title%22%2C+%22publish_date%22%2C+%22by_statement%22%2C+%22pub_type%22%29)
-Have a look at the fix functions: https://metafacture.org/metafacture-documentation/docs/fix/Fix-functions.html (Hint: you could use `add_field` or `timestamp`. And don't forget to add the new element to `retain`)
+Have a look at the [Fix functions](https://metafacture.org/metafacture-documentation/docs/fix/Fix-functions.html). (Hint: you could use `add_field` or `timestamp`. And don't forget to add the new element to `retain`)
diff --git a/docs/04_Fix-Path.md b/docs/04_Fix-Path.md
index c065b3c..d8f1ceb 100644
--- a/docs/04_Fix-Path.md
+++ b/docs/04_Fix-Path.md
@@ -79,9 +79,9 @@ x:
c: Hello :-)
```
-Then you would point to the c field with this path: `x.y.z.a.b.c`.
+you would point to the c field with this path: `x.y.z.a.b.c`.
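+
+For example, to pull that deeply nested value up to a top-level field, `move_field` takes the full path (a sketch; `greeting` is just an invented target name):
+
+```perl
+move_field("x.y.z.a.b.c", "greeting")
+```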
-So lets do some simple excercises:
+So let's do some simple exercises:
[Try and complete the fix functions. Transform the element `a` into `title` and combine the subfields of `b` and `c` to the element `author`.](https://metafacture.org/playground/?flux=inputFile%0A%7Copen-file%0A%7Cas-records%0A%7Cdecode-yaml%0A%7Cfix%28transformationFile%29%0A%7Cencode-json%28prettyPrinting%3D%22true%22%29%0A%7Cprint%0A%3B&transformation=move_field%28%22a%22%2C+%22title%22%29%0Apaste%28%22author%22%2C+%22...%22%2C+...%2C+%22~from%22%2C+...%29%0Aretain%28%22title%22%2C+%22author%22%29&data=---%0Aa%3A+Faust%0Ab+%3A%0A++ln%3A+Goethe%0A++fn%3A+JW%0Ac%3A+Weimar%0A%0A---%0Aa%3A+R%C3%A4uber%0Ab+%3A%0A++++ln%3A+Schiller%0A++++fn%3A+F%0Ac%3A+Weimar)
@@ -99,7 +99,7 @@ There are two extra path structures that need to be explained:
For both repeated fields and arrays you need to use an **index** to select an element.
-For YAML and JSON-arrays specifically you also need to use an **array marker** that are generated by the YAML- and JSON- decoders can be interpreted by the encoders for YAML and JSON.
+For YAML and JSON arrays specifically you also need **array markers**, which are generated by the YAML and JSON decoders and interpreted by the YAML and JSON encoders.
### Working with repeated fields
@@ -113,7 +113,7 @@ creator: Bob
To point to one of the `creator` elements you need to use an index. The first index has the value 1, the second the value 2, the third the value 3. So, the path of the creator Bob would be `creator.3`. (In contrast, Catmandu uses a zero-based index, starting with 0 as the first index.)
-If you want to refer to all creators then you can use the `*` sign as a wildcard: `creator.*` refers to all creator elements. The first instance can be selected by the `$first` wildcard and the last by `$last`. This is espacially handy if you do not know how often an element is repeated. When adding an additional repeated element you can use the `$append` or `$prepend` wildcards.
+If you want to refer to all creators then you can use the `*` sign as a wildcard: `creator.*` refers to all creator elements. The first instance can be selected by the `$first` wildcard and the last by `$last`. This is especially handy if you do not know how often an element is repeated. When adding an additional repeated element you can use the `$append` or `$prepend` wildcards.
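+
+A sketch of these wildcards in use, assuming the repeated `creator` field from above (`main_creator` is an invented target name):
+
+```perl
+move_field("creator.$first", "main_creator")
+add_field("creator.$append", "another creator")
+```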
[`append` the correct last name to the three investigators: Justus Jonas, Peter Shaw and Bob Andrews. Also `prepend` "Investigator" to all of them.](https://metafacture.org/playground/?flux=inputFile%0A%7Copen-file%0A%7Cas-records%0A%7Cdecode-yaml%0A%7Cfix%28transformationFile%29%0A%7Cencode-json%28prettyPrinting%3D%22true%22%29%0A%7Cprint%0A%3B&transformation=&data=---%0Acreator%3A+Justus%0Acreator%3A+Peter%0Acreator%3A+Bob%0A)
@@ -131,12 +131,12 @@ prepend("creator.*","Investigator ")
-Hint: Sometimes a repeatable field only can appear only once or not at all. If the record only provide the element once Metafacture (as Catmandu does as well) interpretes the single appearance of an field not a list but as a simple field or an object. You have to adjust your transformation to meet both scenarios. One way how to deal with this is the list bind, which is agnostic to how often an element is provided. The list bind will be introduced in the next session 05.
+Hint: Sometimes a repeatable field appears only once or not at all. If the record provides the element only once, Metafacture (like Catmandu) interprets the single appearance of a field not as a list but as a simple field or an object. You have to adjust your transformation to meet both scenarios. One way to deal with this is the list bind, which is agnostic to how often an element is provided. The list bind will be introduced in session 05.
### Working with JSON and YAML arrays
-In JSON or YAML element repetion is possible but unusual. Instead of repeating elements an element can have a list or array of values.
+In JSON or YAML element repetition is possible but unusual. Instead of repeating elements an element can have a list or array of values.
In our book example we have an array as value:
@@ -152,9 +152,9 @@ Our example from above would look like this if creator was a list instead of a r
```yaml
creator:
- - Justus
- - Peter
- - Bob
+ - Justus
+ - Peter
+ - Bob
```
Lists can be deeply nested if the values are not just strings (list of strings) but objects (list of objects):
@@ -185,7 +185,7 @@ So, the path of the `red` would be: `my.colors[].2`
And the path for `Peter` would be `characters[].2.name`
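+
+Addressing into that nested list works the same way (a sketch; `second_character` is an invented target name):
+
+```perl
+move_field("characters[].2.name", "second_character")
+```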
-Also if you want to generate an array in the target format JSON or YAML, then you need to add `[]` at the end of an list element like `newArray[]`.
+If you want to generate an array in the target format JSON or YAML, then you need to add `[]` at the end of a list element like `newArray[]`.
## Exercise:
@@ -209,9 +209,9 @@ Especially when working with complex bibliographic data one has to get to know t
There exist multiple ways to find out the path names of records. Two examples:
-1) [Here a way to show pathways in combination with values.](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+as-lines%0A%7C+decode-pica%0A%7C+fix%28%22nothing%28%29%22%2C+repeatedFieldsToEntities+%3D+%22true%22%29%0A%7C+flatten%0A%7C+encode-literals%0A%7C+print%0A%3B&data=001@+%1Fa5%1F01-2%1E001A+%1F01100%3A15-10-94%1E001B+%1F09999%3A12-06-06%1Ft16%3A10%3A17.000%1E001D+%1F09999%3A99-99-99%1E001U+%1F0utf8%1E001X+%1F00%1E002@+%1F0Aag%1E003@+%1F0482147350%1E006U+%1F094%2CP05%1E007E+%1F0U+70.16407%1E007I+%1FSo%1F074057548%1E011@+%1Fa1970%1E017A+%1Farh%1E021A+%1FaDie+@Berufsfreiheit+der+Arbeitnehmer+und+ihre+Ausgestaltung+in+vo%CC%88lkerrechtlichen+Vertra%CC%88gen%1FdEine+Grundrechtsbetrachtg%1E028A+%1F9106884905%1F7Tn3%1FAgnd%1F0106884905%1FaProjahn%1FdHorst+D.%1E033A+%1FpWu%CC%88rzburg%1E034D+%1FaXXXVIII%2C+165+S.%1E034I+%1Fa8%1E037C+%1FaWu%CC%88rzburg%2C+Jur.+F.%2C+Diss.+v.+7.+Aug.+1970%1E%0A001@+%1F01%1Fa5%1E001A+%1F01140%3A08-12-99%1E001B+%1F09999%3A05-01-08%1Ft22%3A57%3A29.000%1E001D+%1F09999%3A99-99-99%1E001U+%1F0utf8%1E001X+%1F00%1E002@+%1F0Aa%1E003@+%1F0958090564%1E004A+%1Ffkart.+%3A+DM+9.70%2C+EUR+4.94%2C+sfr+8.00%2C+S+68.00%1E006U+%1F000%2CB05%2C0285%1E007I+%1FSo%1F076088278%1E011@+%1Fa1999%1E017A+%1Farb%1Fasi%1E019@+%1FaXA-AT%1E021A+%1FaZukunft+Bildung%1FhPolitische+Akademie.+%5BHrsg.+von+Gu%CC%88nther+R.+Burkert-Dottolo+und+Bernhard+Moser%5D%1E028C+%1F9130681849%1F7Tp1%1FVpiz%1FAgnd%1F0130681849%1FE1952%1FaBurkert%1FdGu%CC%88nther+R.%1FBHrsg.%1E033A+%1FpWien%1FnPolit.+Akad.%1E034D+%1Fa79+S.%1E034I+%1Fa24+cm%1E036F+%1Fx299+12%1F9551720077%1FgAdn%1F7Tb1%1FAgnd%1F01040469-7%1FaPolitische+Akademie%1FgWien%1FYPA-Information%1FhPolitische+Akademie%2C+WB%1FpWien%1FJPolitische+Akad.%2C+WB%1Fl99%2C2%1E036F/01+%1Fx12%1F9025841467%1FgAdvz%1Fi2142105-5%1FYAktuelle+Fragen+der+Politik%1FhPolitische+Akademie%1FpWien%1FJPolitische+Akad.+der+O%CC%88VP%1FlBd.+2%1E045E+%1Fa22%1Fd18%1Fm370%1E047A+%1FSFE%1Fata%1E%0A001@+%1Fa5%1F01%1E001A+%1F01140%3A19-0
2-03%1E001B+%1F09999%3A19-06-11%1Ft01%3A20%3A13.000%1E001D+%1F09999%3A26-04-03%1E001U+%1F0utf8%1E001X+%1F00%1E002@+%1F0Aal%1E003@+%1F0361809549%1E004A+%1FfHlw.%1E006U+%1F000%2CL01%1E006U+%1F004%2CP01-s-41%1E006U+%1F004%2CP01-f-21%1E007G+%1FaDNB%1F0361809549%1E007I+%1FSo%1F072658383%1E007M+%1F04413/0275%1E011@+%1Fa1925%1E019@+%1FaXA-DXDE%1FaXA-DE%1E021A+%1FaHundert+Jahre+Buchdrucker-Innung+Hamburg%1FdWesen+u.+Werden+d.+Vereinigungen+Hamburger+Buchdruckereibesitzer+1825-1925+%3B+Gedenkschrift+zur+100.+Wiederkehr+d.+Gru%CC%88ndungstages%2C+verf.+im+Auftr.+d.+Vorstandes+d.+Buchdrucker-Innung+%28Freie+Innung%29+zu+Hamburg%1FhFriedrich+Voeltzer%1E028A+%1F9101386281%1F7Tp1%1FVpiz%1FAgnd%1F0101386281%1FE1895%1FaVo%CC%88ltzer%1FdFriedrich%1E033A+%1FpHamburg%1FnBuchdrucker-Innung+%28Freie+Innung%29%1E033A+%1FpHamburg%1Fn%5BVerlagsbuchh.+Broschek+%26+Co.%5D%1E034D+%1Fa44+S.%1E034I+%1Fa4%1E%0A001@+%1Fa5%1F01-3%1E001A+%1F01240%3A01-08-95%1E001B+%1F09999%3A24-09-10%1Ft17%3A42%3A20.000%1E001D+%1F09999%3A99-99-99%1E001U+%1F0utf8%1E001X+%1F00%1E002@+%1F0Af%1E003@+%1F0945184085%1E004A+%1F03-89007-044-2%1FfGewebe+%3A+DM+198.00%2C+sfr+198.00%2C+S+1386.00%1E006T+%1F095%2CN35%2C0856%1E006U+%1F095%2CA48%2C1186%1E006U+%1F010%2CP01%1E007I+%1FSo%1F061975997%1E011@+%1Fa1995%1E017A+%1Fara%1E021A+%1Fx213%1F9550711899%1FYNeues+Handbuch+der+Musikwissenschaft%1Fhhrsg.+von+Carl+Dahlhaus.+Fortgef.+von+Hermann+Danuser%1FpLaaber%1FJLaaber-Verl.%1FS48%1F03-89007-030-2%1FgAc%1E021B+%1FlBd.+13.%1FaRegister%1Fhzsgest.+von+Hans-Joachim+Hinrichsen%1E028C+%1F9121445453%1F7Tp3%1FVpiz%1FAgnd%1F0121445453%1FE1952%1FaHinrichsen%1FdHans-Joachim%1E034D+%1FaVIII%2C+408+S.%1E045V+%1F9090001001%1E047A+%1FSFE%1Fagb/fm%1E%0A001@+%1F01-2%1Fa5%1E001A+%1F01239%3A18-08-11%1E001B+%1F09999%3A05-09-11%1Ft23%3A31%3A44.000%1E001D+%1F01240%3A30-08-11%1E001U+%1F0utf8%1E001X+%1F00%1E002@+%1F0Af%1E003@+%1F01014417392%1E004A+%1Ffkart.%1E006U+%1F011%2CA37%1E007G+%1FaDNB%1F01014417392%1E007I+%1FSo%1F0752937239%1E010@+%1Fager%1E011@+%1
Fa2011%1E017A+%1Fara%1Fasf%1E021A+%1Fxtr%1F91014809657%1F7Tp3%1FVpiz%1FAgnd%1F01034622773%1FE1958%1FaLu%CC%88beck%1FdMonika%1FYPersonalwirtschaft+mit+DATEV%1FhMonika+Lu%CC%88beck+%3B+Helmut+Lu%CC%88beck%1FpBodenheim%1FpWien%1FJHerdt%1FRXA-DE%1FS650%1FgAc%1E021B+%1FlTrainerbd.%1E032@+%1Fg11%1Fa1.+Ausg.%1E034D+%1Fa129+S.%1E034M+%1FaIll.%1E047A+%1FSFE%1Famar%1E047A+%1FSERW%1Fasal%1E047I+%1Fu%24%1Fc04%1FdDNB%1Fe1%1E)
+1) [a way to show pathways in combination with values](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+as-lines%0A%7C+decode-pica%0A%7C+fix%28%22nothing%28%29%22%2C+repeatedFieldsToEntities+%3D+%22true%22%29%0A%7C+flatten%0A%7C+encode-literals%0A%7C+print%0A%3B&data=001@+%1Fa5%1F01-2%1E001A+%1F01100%3A15-10-94%1E001B+%1F09999%3A12-06-06%1Ft16%3A10%3A17.000%1E001D+%1F09999%3A99-99-99%1E001U+%1F0utf8%1E001X+%1F00%1E002@+%1F0Aag%1E003@+%1F0482147350%1E006U+%1F094%2CP05%1E007E+%1F0U+70.16407%1E007I+%1FSo%1F074057548%1E011@+%1Fa1970%1E017A+%1Farh%1E021A+%1FaDie+@Berufsfreiheit+der+Arbeitnehmer+und+ihre+Ausgestaltung+in+vo%CC%88lkerrechtlichen+Vertra%CC%88gen%1FdEine+Grundrechtsbetrachtg%1E028A+%1F9106884905%1F7Tn3%1FAgnd%1F0106884905%1FaProjahn%1FdHorst+D.%1E033A+%1FpWu%CC%88rzburg%1E034D+%1FaXXXVIII%2C+165+S.%1E034I+%1Fa8%1E037C+%1FaWu%CC%88rzburg%2C+Jur.+F.%2C+Diss.+v.+7.+Aug.+1970%1E%0A001@+%1F01%1Fa5%1E001A+%1F01140%3A08-12-99%1E001B+%1F09999%3A05-01-08%1Ft22%3A57%3A29.000%1E001D+%1F09999%3A99-99-99%1E001U+%1F0utf8%1E001X+%1F00%1E002@+%1F0Aa%1E003@+%1F0958090564%1E004A+%1Ffkart.+%3A+DM+9.70%2C+EUR+4.94%2C+sfr+8.00%2C+S+68.00%1E006U+%1F000%2CB05%2C0285%1E007I+%1FSo%1F076088278%1E011@+%1Fa1999%1E017A+%1Farb%1Fasi%1E019@+%1FaXA-AT%1E021A+%1FaZukunft+Bildung%1FhPolitische+Akademie.+%5BHrsg.+von+Gu%CC%88nther+R.+Burkert-Dottolo+und+Bernhard+Moser%5D%1E028C+%1F9130681849%1F7Tp1%1FVpiz%1FAgnd%1F0130681849%1FE1952%1FaBurkert%1FdGu%CC%88nther+R.%1FBHrsg.%1E033A+%1FpWien%1FnPolit.+Akad.%1E034D+%1Fa79+S.%1E034I+%1Fa24+cm%1E036F+%1Fx299+12%1F9551720077%1FgAdn%1F7Tb1%1FAgnd%1F01040469-7%1FaPolitische+Akademie%1FgWien%1FYPA-Information%1FhPolitische+Akademie%2C+WB%1FpWien%1FJPolitische+Akad.%2C+WB%1Fl99%2C2%1E036F/01+%1Fx12%1F9025841467%1FgAdvz%1Fi2142105-5%1FYAktuelle+Fragen+der+Politik%1FhPolitische+Akademie%1FpWien%1FJPolitische+Akad.+der+O%CC%88VP%1FlBd.+2%1E045E+%1Fa22%1Fd18%1Fm370%1E047A+%1FSFE%1Fata%1E%0A001@+%1Fa5%1F01%1E001A+%1F01140%3A19-02-03%1
E001B+%1F09999%3A19-06-11%1Ft01%3A20%3A13.000%1E001D+%1F09999%3A26-04-03%1E001U+%1F0utf8%1E001X+%1F00%1E002@+%1F0Aal%1E003@+%1F0361809549%1E004A+%1FfHlw.%1E006U+%1F000%2CL01%1E006U+%1F004%2CP01-s-41%1E006U+%1F004%2CP01-f-21%1E007G+%1FaDNB%1F0361809549%1E007I+%1FSo%1F072658383%1E007M+%1F04413/0275%1E011@+%1Fa1925%1E019@+%1FaXA-DXDE%1FaXA-DE%1E021A+%1FaHundert+Jahre+Buchdrucker-Innung+Hamburg%1FdWesen+u.+Werden+d.+Vereinigungen+Hamburger+Buchdruckereibesitzer+1825-1925+%3B+Gedenkschrift+zur+100.+Wiederkehr+d.+Gru%CC%88ndungstages%2C+verf.+im+Auftr.+d.+Vorstandes+d.+Buchdrucker-Innung+%28Freie+Innung%29+zu+Hamburg%1FhFriedrich+Voeltzer%1E028A+%1F9101386281%1F7Tp1%1FVpiz%1FAgnd%1F0101386281%1FE1895%1FaVo%CC%88ltzer%1FdFriedrich%1E033A+%1FpHamburg%1FnBuchdrucker-Innung+%28Freie+Innung%29%1E033A+%1FpHamburg%1Fn%5BVerlagsbuchh.+Broschek+%26+Co.%5D%1E034D+%1Fa44+S.%1E034I+%1Fa4%1E%0A001@+%1Fa5%1F01-3%1E001A+%1F01240%3A01-08-95%1E001B+%1F09999%3A24-09-10%1Ft17%3A42%3A20.000%1E001D+%1F09999%3A99-99-99%1E001U+%1F0utf8%1E001X+%1F00%1E002@+%1F0Af%1E003@+%1F0945184085%1E004A+%1F03-89007-044-2%1FfGewebe+%3A+DM+198.00%2C+sfr+198.00%2C+S+1386.00%1E006T+%1F095%2CN35%2C0856%1E006U+%1F095%2CA48%2C1186%1E006U+%1F010%2CP01%1E007I+%1FSo%1F061975997%1E011@+%1Fa1995%1E017A+%1Fara%1E021A+%1Fx213%1F9550711899%1FYNeues+Handbuch+der+Musikwissenschaft%1Fhhrsg.+von+Carl+Dahlhaus.+Fortgef.+von+Hermann+Danuser%1FpLaaber%1FJLaaber-Verl.%1FS48%1F03-89007-030-2%1FgAc%1E021B+%1FlBd.+13.%1FaRegister%1Fhzsgest.+von+Hans-Joachim+Hinrichsen%1E028C+%1F9121445453%1F7Tp3%1FVpiz%1FAgnd%1F0121445453%1FE1952%1FaHinrichsen%1FdHans-Joachim%1E034D+%1FaVIII%2C+408+S.%1E045V+%1F9090001001%1E047A+%1FSFE%1Fagb/fm%1E%0A001@+%1F01-2%1Fa5%1E001A+%1F01239%3A18-08-11%1E001B+%1F09999%3A05-09-11%1Ft23%3A31%3A44.000%1E001D+%1F01240%3A30-08-11%1E001U+%1F0utf8%1E001X+%1F00%1E002@+%1F0Af%1E003@+%1F01014417392%1E004A+%1Ffkart.%1E006U+%1F011%2CA37%1E007G+%1FaDNB%1F01014417392%1E007I+%1FSo%1F0752937239%1E010@+%1Fager%1E011@+%1Fa2011
%1E017A+%1Fara%1Fasf%1E021A+%1Fxtr%1F91014809657%1F7Tp3%1FVpiz%1FAgnd%1F01034622773%1FE1958%1FaLu%CC%88beck%1FdMonika%1FYPersonalwirtschaft+mit+DATEV%1FhMonika+Lu%CC%88beck+%3B+Helmut+Lu%CC%88beck%1FpBodenheim%1FpWien%1FJHerdt%1FRXA-DE%1FS650%1FgAc%1E021B+%1FlTrainerbd.%1E032@+%1Fg11%1Fa1.+Ausg.%1E034D+%1Fa129+S.%1E034M+%1FaIll.%1E047A+%1FSFE%1Famar%1E047A+%1FSERW%1Fasal%1E047I+%1Fu%24%1Fc04%1FdDNB%1Fe1%1E)
-2) [Here is a way to collect and count all paths in all records by using the `list-fix-paths` command.](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+as-lines%0A%7C+decode-pica%0A%7C+list-fix-paths%0A%7C+print%0A%3B&data=001@+%1Fa5%1F01-2%1E001A+%1F01100%3A15-10-94%1E001B+%1F09999%3A12-06-06%1Ft16%3A10%3A17.000%1E001D+%1F09999%3A99-99-99%1E001U+%1F0utf8%1E001X+%1F00%1E002@+%1F0Aag%1E003@+%1F0482147350%1E006U+%1F094%2CP05%1E007E+%1F0U+70.16407%1E007I+%1FSo%1F074057548%1E011@+%1Fa1970%1E017A+%1Farh%1E021A+%1FaDie+@Berufsfreiheit+der+Arbeitnehmer+und+ihre+Ausgestaltung+in+vo%CC%88lkerrechtlichen+Vertra%CC%88gen%1FdEine+Grundrechtsbetrachtg%1E028A+%1F9106884905%1F7Tn3%1FAgnd%1F0106884905%1FaProjahn%1FdHorst+D.%1E033A+%1FpWu%CC%88rzburg%1E034D+%1FaXXXVIII%2C+165+S.%1E034I+%1Fa8%1E037C+%1FaWu%CC%88rzburg%2C+Jur.+F.%2C+Diss.+v.+7.+Aug.+1970%1E%0A001@+%1F01%1Fa5%1E001A+%1F01140%3A08-12-99%1E001B+%1F09999%3A05-01-08%1Ft22%3A57%3A29.000%1E001D+%1F09999%3A99-99-99%1E001U+%1F0utf8%1E001X+%1F00%1E002@+%1F0Aa%1E003@+%1F0958090564%1E004A+%1Ffkart.+%3A+DM+9.70%2C+EUR+4.94%2C+sfr+8.00%2C+S+68.00%1E006U+%1F000%2CB05%2C0285%1E007I+%1FSo%1F076088278%1E011@+%1Fa1999%1E017A+%1Farb%1Fasi%1E019@+%1FaXA-AT%1E021A+%1FaZukunft+Bildung%1FhPolitische+Akademie.+%5BHrsg.+von+Gu%CC%88nther+R.+Burkert-Dottolo+und+Bernhard+Moser%5D%1E028C+%1F9130681849%1F7Tp1%1FVpiz%1FAgnd%1F0130681849%1FE1952%1FaBurkert%1FdGu%CC%88nther+R.%1FBHrsg.%1E033A+%1FpWien%1FnPolit.+Akad.%1E034D+%1Fa79+S.%1E034I+%1Fa24+cm%1E036F+%1Fx299+12%1F9551720077%1FgAdn%1F7Tb1%1FAgnd%1F01040469-7%1FaPolitische+Akademie%1FgWien%1FYPA-Information%1FhPolitische+Akademie%2C+WB%1FpWien%1FJPolitische+Akad.%2C+WB%1Fl99%2C2%1E036F/01+%1Fx12%1F9025841467%1FgAdvz%1Fi2142105-5%1FYAktuelle+Fragen+der+Politik%1FhPolitische+Akademie%1FpWien%1FJPolitische+Akad.+der+O%CC%88VP%1FlBd.+2%1E045E+%1Fa22%1Fd18%1Fm370%1E047A+%1FSFE%1Fata%1E%0A001@+%1Fa5%1F01%1E001A+%1F01140%3A19-02-03%1E001B+%1F09999%3A19-06-11%1Ft01%3A20%3A13.00
0%1E001D+%1F09999%3A26-04-03%1E001U+%1F0utf8%1E001X+%1F00%1E002@+%1F0Aal%1E003@+%1F0361809549%1E004A+%1FfHlw.%1E006U+%1F000%2CL01%1E006U+%1F004%2CP01-s-41%1E006U+%1F004%2CP01-f-21%1E007G+%1FaDNB%1F0361809549%1E007I+%1FSo%1F072658383%1E007M+%1F04413/0275%1E011@+%1Fa1925%1E019@+%1FaXA-DXDE%1FaXA-DE%1E021A+%1FaHundert+Jahre+Buchdrucker-Innung+Hamburg%1FdWesen+u.+Werden+d.+Vereinigungen+Hamburger+Buchdruckereibesitzer+1825-1925+%3B+Gedenkschrift+zur+100.+Wiederkehr+d.+Gru%CC%88ndungstages%2C+verf.+im+Auftr.+d.+Vorstandes+d.+Buchdrucker-Innung+%28Freie+Innung%29+zu+Hamburg%1FhFriedrich+Voeltzer%1E028A+%1F9101386281%1F7Tp1%1FVpiz%1FAgnd%1F0101386281%1FE1895%1FaVo%CC%88ltzer%1FdFriedrich%1E033A+%1FpHamburg%1FnBuchdrucker-Innung+%28Freie+Innung%29%1E033A+%1FpHamburg%1Fn%5BVerlagsbuchh.+Broschek+%26+Co.%5D%1E034D+%1Fa44+S.%1E034I+%1Fa4%1E%0A001@+%1Fa5%1F01-3%1E001A+%1F01240%3A01-08-95%1E001B+%1F09999%3A24-09-10%1Ft17%3A42%3A20.000%1E001D+%1F09999%3A99-99-99%1E001U+%1F0utf8%1E001X+%1F00%1E002@+%1F0Af%1E003@+%1F0945184085%1E004A+%1F03-89007-044-2%1FfGewebe+%3A+DM+198.00%2C+sfr+198.00%2C+S+1386.00%1E006T+%1F095%2CN35%2C0856%1E006U+%1F095%2CA48%2C1186%1E006U+%1F010%2CP01%1E007I+%1FSo%1F061975997%1E011@+%1Fa1995%1E017A+%1Fara%1E021A+%1Fx213%1F9550711899%1FYNeues+Handbuch+der+Musikwissenschaft%1Fhhrsg.+von+Carl+Dahlhaus.+Fortgef.+von+Hermann+Danuser%1FpLaaber%1FJLaaber-Verl.%1FS48%1F03-89007-030-2%1FgAc%1E021B+%1FlBd.+13.%1FaRegister%1Fhzsgest.+von+Hans-Joachim+Hinrichsen%1E028C+%1F9121445453%1F7Tp3%1FVpiz%1FAgnd%1F0121445453%1FE1952%1FaHinrichsen%1FdHans-Joachim%1E034D+%1FaVIII%2C+408+S.%1E045V+%1F9090001001%1E047A+%1FSFE%1Fagb/fm%1E%0A001@+%1F01-2%1Fa5%1E001A+%1F01239%3A18-08-11%1E001B+%1F09999%3A05-09-11%1Ft23%3A31%3A44.000%1E001D+%1F01240%3A30-08-11%1E001U+%1F0utf8%1E001X+%1F00%1E002@+%1F0Af%1E003@+%1F01014417392%1E004A+%1Ffkart.%1E006U+%1F011%2CA37%1E007G+%1FaDNB%1F01014417392%1E007I+%1FSo%1F0752937239%1E010@+%1Fager%1E011@+%1Fa2011%1E017A+%1Fara%1Fasf%1E021A+%1Fxtr%1F9101480
9657%1F7Tp3%1FVpiz%1FAgnd%1F01034622773%1FE1958%1FaLu%CC%88beck%1FdMonika%1FYPersonalwirtschaft+mit+DATEV%1FhMonika+Lu%CC%88beck+%3B+Helmut+Lu%CC%88beck%1FpBodenheim%1FpWien%1FJHerdt%1FRXA-DE%1FS650%1FgAc%1E021B+%1FlTrainerbd.%1E032@+%1Fg11%1Fa1.+Ausg.%1E034D+%1Fa129+S.%1E034M+%1FaIll.%1E047A+%1FSFE%1Famar%1E047A+%1FSERW%1Fasal%1E047I+%1Fu%24%1Fc04%1FdDNB%1Fe1%1E)
+2) [a way to collect and count all paths in all records by using the `list-fix-paths` command](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+as-lines%0A%7C+decode-pica%0A%7C+list-fix-paths%0A%7C+print%0A%3B&data=001@+%1Fa5%1F01-2%1E001A+%1F01100%3A15-10-94%1E001B+%1F09999%3A12-06-06%1Ft16%3A10%3A17.000%1E001D+%1F09999%3A99-99-99%1E001U+%1F0utf8%1E001X+%1F00%1E002@+%1F0Aag%1E003@+%1F0482147350%1E006U+%1F094%2CP05%1E007E+%1F0U+70.16407%1E007I+%1FSo%1F074057548%1E011@+%1Fa1970%1E017A+%1Farh%1E021A+%1FaDie+@Berufsfreiheit+der+Arbeitnehmer+und+ihre+Ausgestaltung+in+vo%CC%88lkerrechtlichen+Vertra%CC%88gen%1FdEine+Grundrechtsbetrachtg%1E028A+%1F9106884905%1F7Tn3%1FAgnd%1F0106884905%1FaProjahn%1FdHorst+D.%1E033A+%1FpWu%CC%88rzburg%1E034D+%1FaXXXVIII%2C+165+S.%1E034I+%1Fa8%1E037C+%1FaWu%CC%88rzburg%2C+Jur.+F.%2C+Diss.+v.+7.+Aug.+1970%1E%0A001@+%1F01%1Fa5%1E001A+%1F01140%3A08-12-99%1E001B+%1F09999%3A05-01-08%1Ft22%3A57%3A29.000%1E001D+%1F09999%3A99-99-99%1E001U+%1F0utf8%1E001X+%1F00%1E002@+%1F0Aa%1E003@+%1F0958090564%1E004A+%1Ffkart.+%3A+DM+9.70%2C+EUR+4.94%2C+sfr+8.00%2C+S+68.00%1E006U+%1F000%2CB05%2C0285%1E007I+%1FSo%1F076088278%1E011@+%1Fa1999%1E017A+%1Farb%1Fasi%1E019@+%1FaXA-AT%1E021A+%1FaZukunft+Bildung%1FhPolitische+Akademie.+%5BHrsg.+von+Gu%CC%88nther+R.+Burkert-Dottolo+und+Bernhard+Moser%5D%1E028C+%1F9130681849%1F7Tp1%1FVpiz%1FAgnd%1F0130681849%1FE1952%1FaBurkert%1FdGu%CC%88nther+R.%1FBHrsg.%1E033A+%1FpWien%1FnPolit.+Akad.%1E034D+%1Fa79+S.%1E034I+%1Fa24+cm%1E036F+%1Fx299+12%1F9551720077%1FgAdn%1F7Tb1%1FAgnd%1F01040469-7%1FaPolitische+Akademie%1FgWien%1FYPA-Information%1FhPolitische+Akademie%2C+WB%1FpWien%1FJPolitische+Akad.%2C+WB%1Fl99%2C2%1E036F/01+%1Fx12%1F9025841467%1FgAdvz%1Fi2142105-5%1FYAktuelle+Fragen+der+Politik%1FhPolitische+Akademie%1FpWien%1FJPolitische+Akad.+der+O%CC%88VP%1FlBd.+2%1E045E+%1Fa22%1Fd18%1Fm370%1E047A+%1FSFE%1Fata%1E%0A001@+%1Fa5%1F01%1E001A+%1F01140%3A19-02-03%1E001B+%1F09999%3A19-06-11%1Ft01%3A20%3A13.000%1E001D+
%1F09999%3A26-04-03%1E001U+%1F0utf8%1E001X+%1F00%1E002@+%1F0Aal%1E003@+%1F0361809549%1E004A+%1FfHlw.%1E006U+%1F000%2CL01%1E006U+%1F004%2CP01-s-41%1E006U+%1F004%2CP01-f-21%1E007G+%1FaDNB%1F0361809549%1E007I+%1FSo%1F072658383%1E007M+%1F04413/0275%1E011@+%1Fa1925%1E019@+%1FaXA-DXDE%1FaXA-DE%1E021A+%1FaHundert+Jahre+Buchdrucker-Innung+Hamburg%1FdWesen+u.+Werden+d.+Vereinigungen+Hamburger+Buchdruckereibesitzer+1825-1925+%3B+Gedenkschrift+zur+100.+Wiederkehr+d.+Gru%CC%88ndungstages%2C+verf.+im+Auftr.+d.+Vorstandes+d.+Buchdrucker-Innung+%28Freie+Innung%29+zu+Hamburg%1FhFriedrich+Voeltzer%1E028A+%1F9101386281%1F7Tp1%1FVpiz%1FAgnd%1F0101386281%1FE1895%1FaVo%CC%88ltzer%1FdFriedrich%1E033A+%1FpHamburg%1FnBuchdrucker-Innung+%28Freie+Innung%29%1E033A+%1FpHamburg%1Fn%5BVerlagsbuchh.+Broschek+%26+Co.%5D%1E034D+%1Fa44+S.%1E034I+%1Fa4%1E%0A001@+%1Fa5%1F01-3%1E001A+%1F01240%3A01-08-95%1E001B+%1F09999%3A24-09-10%1Ft17%3A42%3A20.000%1E001D+%1F09999%3A99-99-99%1E001U+%1F0utf8%1E001X+%1F00%1E002@+%1F0Af%1E003@+%1F0945184085%1E004A+%1F03-89007-044-2%1FfGewebe+%3A+DM+198.00%2C+sfr+198.00%2C+S+1386.00%1E006T+%1F095%2CN35%2C0856%1E006U+%1F095%2CA48%2C1186%1E006U+%1F010%2CP01%1E007I+%1FSo%1F061975997%1E011@+%1Fa1995%1E017A+%1Fara%1E021A+%1Fx213%1F9550711899%1FYNeues+Handbuch+der+Musikwissenschaft%1Fhhrsg.+von+Carl+Dahlhaus.+Fortgef.+von+Hermann+Danuser%1FpLaaber%1FJLaaber-Verl.%1FS48%1F03-89007-030-2%1FgAc%1E021B+%1FlBd.+13.%1FaRegister%1Fhzsgest.+von+Hans-Joachim+Hinrichsen%1E028C+%1F9121445453%1F7Tp3%1FVpiz%1FAgnd%1F0121445453%1FE1952%1FaHinrichsen%1FdHans-Joachim%1E034D+%1FaVIII%2C+408+S.%1E045V+%1F9090001001%1E047A+%1FSFE%1Fagb/fm%1E%0A001@+%1F01-2%1Fa5%1E001A+%1F01239%3A18-08-11%1E001B+%1F09999%3A05-09-11%1Ft23%3A31%3A44.000%1E001D+%1F01240%3A30-08-11%1E001U+%1F0utf8%1E001X+%1F00%1E002@+%1F0Af%1E003@+%1F01014417392%1E004A+%1Ffkart.%1E006U+%1F011%2CA37%1E007G+%1FaDNB%1F01014417392%1E007I+%1FSo%1F0752937239%1E010@+%1Fager%1E011@+%1Fa2011%1E017A+%1Fara%1Fasf%1E021A+%1Fxtr%1F91014809657%1F7T
p3%1FVpiz%1FAgnd%1F01034622773%1FE1958%1FaLu%CC%88beck%1FdMonika%1FYPersonalwirtschaft+mit+DATEV%1FhMonika+Lu%CC%88beck+%3B+Helmut+Lu%CC%88beck%1FpBodenheim%1FpWien%1FJHerdt%1FRXA-DE%1FS650%1FgAc%1E021B+%1FlTrainerbd.%1E032@+%1Fg11%1Fa1.+Ausg.%1E034D+%1Fa129+S.%1E034M+%1FaIll.%1E047A+%1FSFE%1Famar%1E047A+%1FSERW%1Fasal%1E047I+%1Fu%24%1Fc04%1FdDNB%1Fe1%1E)
## Bonus: XML in MF and their paths
@@ -219,11 +219,11 @@ There exists multiple ways to find out the path-names of records. Two examples:
The path for the value `This is the title` is not `title` but `title.value`
-XMLs are not just simple elements with key-pair values or objects with subfields but each elemnt can have additional attributs. In Metafacture the xml decoder (`decode-xml` with `handle-generic-xml`) groups the attributes and values as subfields of an object.
+XMLs are not just simple elements with key-pair values or objects with subfields but each element can have additional attributes. In Metafacture the XML decoder (`decode-xml` with `handle-generic-xml`) groups the attributes and values as subfields of an object.
`<title type="…" lang="…">This is the title</title>`
-The path for the different attributs and elements are the following:
+The paths for the different attributes and elements are the following:
```yaml
title.value
@@ -231,7 +231,7 @@ title.type
title.lang
```
-If you want to create xml with attributes then you need to map to this structure too. We will come back to lection working with xml [in lesson 10](10_Working_with_XML.md).
+If you want to create XML with attributes you need to map to this structure, too. We will come back to working with XML [in lesson 10](10_Working_with_XML).
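+
+For example, a minimal Fix sketch that builds this structure before encoding it as XML (the attribute values `main` and `en` are made up, just for illustration):
+
+```perl
+# hypothetical attribute values, for illustration only
+add_field("title.value","This is the title")
+add_field("title.type","main")
+add_field("title.lang","en")
+```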
---------------
diff --git a/docs/05-More-Fix-Concepts.md b/docs/05-More-Fix-Concepts.md
index 0a6ca47..b372034 100644
--- a/docs/05-More-Fix-Concepts.md
+++ b/docs/05-More-Fix-Concepts.md
@@ -9,7 +9,7 @@ parent: Tutorial
We already learned about simple Fixes aka *[Fix functions](https://metafacture.github.io/metafacture-documentation/docs/fix/Fix-functions.html)* but there are three additional concepts in Fix: selectors, conditionals and binds.
-These Fix concepts were introduced by Catmandu (see [functions](https://librecat.org/Catmandu/#functions), [selector](https://librecat.org/Catmandu/#selectors), [conditionals](https://librecat.org/Catmandu/#conditionals) and [binds](https://librecat.org/Catmandu/#binds)). But be aware that Metafacture Fix does not support all of the specific functions, selectors, conditionals and binds from Catmandu. Check the documentation for a full overview of the supported [Fix functions](https://metafacture.org/metafacture-documentation/docs/fix/Fix-functions.html).
+These Fix concepts were introduced by Catmandu (see [functions](https://librecat.org/Catmandu/#functions), [selector](https://librecat.org/Catmandu/#selectors), [conditionals](https://librecat.org/Catmandu/#conditionals) and [binds](https://librecat.org/Catmandu/#binds)). Be aware that Metafacture Fix does not support all of the specific functions, selectors, conditionals and binds from Catmandu. Check the documentation for a full overview of the supported [Fix functions](https://metafacture.org/metafacture-documentation/docs/fix/Fix-functions.html).
## Additional concepts
@@ -47,16 +47,16 @@ Fix functions are used to add, change, remove or otherwise manipulate elements.
The other three concepts help when you intend to use more complex transformations:
-*[Conditionals](https://metafacture.github.io/metafacture-documentation/docs/fix/Fix-functions.html#conditionals)* are used to control the processing of fix functions. The included fix functions are not process with every workflow but only under certain conditions.
+*[Conditionals](https://metafacture.github.io/metafacture-documentation/docs/fix/Fix-functions.html#conditionals)* are used to control the processing of Fix functions. The included Fix functions are not processed with every workflow but only under certain conditions.
*[Selectors](https://metafacture.github.io/metafacture-documentation/docs/fix/Fix-functions.html#selectors)* can be used to filter the records you want.
-*[Binds](https://metafacture.github.io/metafacture-documentation/docs/fix/Fix-functions.html#binds)* are wrappers for one or more fixes. They give extra control functionality for fixes such as loops. All binds have the same syntax:
+*[Binds](https://metafacture.github.io/metafacture-documentation/docs/fix/Fix-functions.html#binds)* are wrappers for one or more Fixes. They give extra control functionality for Fixes such as loops. All binds have the same syntax:
```perl
-do Bind(params,…)
- fix(..)
- fix(..)
+do Bind(params,...)
+ Fix(...)
+ Fix(...)
end
```
@@ -64,7 +64,7 @@ end
Conditionals are a common concept in programming and scripting languages. They control processes, in our scenario: transformations, with regard to specific requirements.
-For example, [we have records some of them are of type book](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+as-records%0A%7C+decode-yaml%0A%7C+fix%28transformationFile%29%0A%7C+encode-yaml%0A%7C+print%0A%3B&transformation=add_field%28%22type%22%2C%22BibliographicResource%22%29&data=---%0Aname%3A+%22The+13+1/2+lives+of+Captain+Bluebear%22%0Amedium%3A+%22Book%22%0Aauthor%3A+%22Walter+Moers%22%0Alanguage%3A+%22eng%22%0A%0A---%0Aname%3A+%22The+13+1/2+lives+of+Captain+Bluebear%22%0Amedium%3A+%22eBook%22%0Aauthor%3A+%22Walter+Moers%22%0Alanguage%3A+%22eng%22%0A%0A---%0Aname%3A+%22Die+13%C2%BD+Leben+des+K%C3%A4pt%E2%80%99n+Blaub%C3%A4r%22%0Amedium%3A+%22Book%22%0Aauthor%3A+%22Walter+Moers%22%0Alanguage%3A+%22ger%22%0A%0A---%0Aname%3A+%22The+13+1/2+lives+of+Captain+Bluebear%22%0Amedium%3A+%22Audio+Book%22%0Aauthor%3A+%22Walter+Moers%22%0Anarrator%3A+%22Bronson+Pinchot%22%0Alanguage%3A+%22eng%22%0A%0A---%0Aname%3A+%22K%C3%A4pt%27n+Blaub%C3%A4r+-+Der+Film%22%0Amedium%3A+%22Movie%22%0Aauthor%3A+%22Walter+Moers%22%0Adirector%3A+%22Hayo+Freitag%22%0Alanguage%3A+%22ger%22):
+For example, [some of these records are of type book](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+as-records%0A%7C+decode-yaml%0A%7C+fix%28transformationFile%29%0A%7C+encode-yaml%0A%7C+print%0A%3B&transformation=add_field%28%22type%22%2C%22BibliographicResource%22%29&data=---%0Aname%3A+%22The+13+1/2+lives+of+Captain+Bluebear%22%0Amedium%3A+%22Book%22%0Aauthor%3A+%22Walter+Moers%22%0Alanguage%3A+%22eng%22%0A%0A---%0Aname%3A+%22The+13+1/2+lives+of+Captain+Bluebear%22%0Amedium%3A+%22eBook%22%0Aauthor%3A+%22Walter+Moers%22%0Alanguage%3A+%22eng%22%0A%0A---%0Aname%3A+%22Die+13%C2%BD+Leben+des+K%C3%A4pt%E2%80%99n+Blaub%C3%A4r%22%0Amedium%3A+%22Book%22%0Aauthor%3A+%22Walter+Moers%22%0Alanguage%3A+%22ger%22%0A%0A---%0Aname%3A+%22The+13+1/2+lives+of+Captain+Bluebear%22%0Amedium%3A+%22Audio+Book%22%0Aauthor%3A+%22Walter+Moers%22%0Anarrator%3A+%22Bronson+Pinchot%22%0Alanguage%3A+%22eng%22%0A%0A---%0Aname%3A+%22K%C3%A4pt%27n+Blaub%C3%A4r+-+Der+Film%22%0Amedium%3A+%22Movie%22%0Aauthor%3A+%22Walter+Moers%22%0Adirector%3A+%22Hayo+Freitag%22%0Alanguage%3A+%22ger%22):
```yaml
---
@@ -114,7 +114,7 @@ if all_contain("medium","Book")
end
```
-or [if medium is **exact** Book](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+as-records%0A%7C+decode-yaml%0A%7C+fix%28transformationFile%29%0A%7C+encode-yaml%0A%7C+print%0A%3B&transformation=if+all_equal%28%22medium%22%2C%22Book%22%29%0A++++add_field%28%22type%22%2C%22BibliographicResource%22%29%0Aend&data=---%0Aname%3A+%22The+13+1/2+lives+of+Captain+Bluebear%22%0Amedium%3A+%22Book%22%0Aauthor%3A+%22Walter+Moers%22%0Alanguage%3A+%22eng%22%0A%0A---%0Aname%3A+%22The+13+1/2+lives+of+Captain+Bluebear%22%0Amedium%3A+%22eBook%22%0Aauthor%3A+%22Walter+Moers%22%0Alanguage%3A+%22eng%22%0A%0A---%0Aname%3A+%22Die+13%C2%BD+Leben+des+K%C3%A4pt%E2%80%99n+Blaub%C3%A4r%22%0Amedium%3A+%22Book%22%0Aauthor%3A+%22Walter+Moers%22%0Alanguage%3A+%22ger%22%0A%0A---%0Aname%3A+%22The+13+1/2+lives+of+Captain+Bluebear%22%0Amedium%3A+%22Audio+Book%22%0Aauthor%3A+%22Walter+Moers%22%0Anarrator%3A+%22Bronson+Pinchot%22%0Alanguage%3A+%22eng%22%0A%0A---%0Aname%3A+%22K%C3%A4pt%27n+Blaub%C3%A4r+-+Der+Film%22%0Amedium%3A+%22Movie%22%0Aauthor%3A+%22Walter+Moers%22%0Adirector%3A+%22Hayo+Freitag%22%0Alanguage%3A+%22ger%22):
+or [if medium is **exactly** Book](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+as-records%0A%7C+decode-yaml%0A%7C+fix%28transformationFile%29%0A%7C+encode-yaml%0A%7C+print%0A%3B&transformation=if+all_equal%28%22medium%22%2C%22Book%22%29%0A++++add_field%28%22type%22%2C%22BibliographicResource%22%29%0Aend&data=---%0Aname%3A+%22The+13+1/2+lives+of+Captain+Bluebear%22%0Amedium%3A+%22Book%22%0Aauthor%3A+%22Walter+Moers%22%0Alanguage%3A+%22eng%22%0A%0A---%0Aname%3A+%22The+13+1/2+lives+of+Captain+Bluebear%22%0Amedium%3A+%22eBook%22%0Aauthor%3A+%22Walter+Moers%22%0Alanguage%3A+%22eng%22%0A%0A---%0Aname%3A+%22Die+13%C2%BD+Leben+des+K%C3%A4pt%E2%80%99n+Blaub%C3%A4r%22%0Amedium%3A+%22Book%22%0Aauthor%3A+%22Walter+Moers%22%0Alanguage%3A+%22ger%22%0A%0A---%0Aname%3A+%22The+13+1/2+lives+of+Captain+Bluebear%22%0Amedium%3A+%22Audio+Book%22%0Aauthor%3A+%22Walter+Moers%22%0Anarrator%3A+%22Bronson+Pinchot%22%0Alanguage%3A+%22eng%22%0A%0A---%0Aname%3A+%22K%C3%A4pt%27n+Blaub%C3%A4r+-+Der+Film%22%0Amedium%3A+%22Movie%22%0Aauthor%3A+%22Walter+Moers%22%0Adirector%3A+%22Hayo+Freitag%22%0Alanguage%3A+%22ger%22):
```perl
if all_equal("medium","Book")
@@ -134,7 +134,7 @@ end
### if...else
-[You can add an `else`-block to any `if` conditional if you want to process fixes only if the contition is `falsy`:](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+as-records%0A%7C+decode-yaml%0A%7C+fix%28transformationFile%29%0A%7C+encode-yaml%0A%7C+print%0A%3B&transformation=if+all_equal%28%22medium%22%2C%22Book%22%29%0A++++add_field%28%22type%22%2C%22BibliographicResource%22%29%0Aelse%0A++++add_field%28%22type%22%2C%22Other%22%29%0Aend&data=---%0Aname%3A+%22The+13+1/2+lives+of+Captain+Bluebear%22%0Amedium%3A+%22Book%22%0Aauthor%3A+%22Walter+Moers%22%0Alanguage%3A+%22eng%22%0A%0A---%0Aname%3A+%22The+13+1/2+lives+of+Captain+Bluebear%22%0Amedium%3A+%22eBook%22%0Aauthor%3A+%22Walter+Moers%22%0Alanguage%3A+%22eng%22%0A%0A---%0Aname%3A+%22Die+13%C2%BD+Leben+des+K%C3%A4pt%E2%80%99n+Blaub%C3%A4r%22%0Amedium%3A+%22Book%22%0Aauthor%3A+%22Walter+Moers%22%0Alanguage%3A+%22ger%22%0A%0A---%0Aname%3A+%22The+13+1/2+lives+of+Captain+Bluebear%22%0Amedium%3A+%22Audio+Book%22%0Aauthor%3A+%22Walter+Moers%22%0Anarrator%3A+%22Bronson+Pinchot%22%0Alanguage%3A+%22eng%22%0A%0A---%0Aname%3A+%22K%C3%A4pt%27n+Blaub%C3%A4r+-+Der+Film%22%0Amedium%3A+%22Movie%22%0Aauthor%3A+%22Walter+Moers%22%0Adirector%3A+%22Hayo+Freitag%22%0Alanguage%3A+%22ger%22)
+[You can add an `else`-block to any `if` conditional if you want to process Fixes only if the condition is `false`:](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+as-records%0A%7C+decode-yaml%0A%7C+fix%28transformationFile%29%0A%7C+encode-yaml%0A%7C+print%0A%3B&transformation=if+all_equal%28%22medium%22%2C%22Book%22%29%0A++++add_field%28%22type%22%2C%22BibliographicResource%22%29%0Aelse%0A++++add_field%28%22type%22%2C%22Other%22%29%0Aend&data=---%0Aname%3A+%22The+13+1/2+lives+of+Captain+Bluebear%22%0Amedium%3A+%22Book%22%0Aauthor%3A+%22Walter+Moers%22%0Alanguage%3A+%22eng%22%0A%0A---%0Aname%3A+%22The+13+1/2+lives+of+Captain+Bluebear%22%0Amedium%3A+%22eBook%22%0Aauthor%3A+%22Walter+Moers%22%0Alanguage%3A+%22eng%22%0A%0A---%0Aname%3A+%22Die+13%C2%BD+Leben+des+K%C3%A4pt%E2%80%99n+Blaub%C3%A4r%22%0Amedium%3A+%22Book%22%0Aauthor%3A+%22Walter+Moers%22%0Alanguage%3A+%22ger%22%0A%0A---%0Aname%3A+%22The+13+1/2+lives+of+Captain+Bluebear%22%0Amedium%3A+%22Audio+Book%22%0Aauthor%3A+%22Walter+Moers%22%0Anarrator%3A+%22Bronson+Pinchot%22%0Alanguage%3A+%22eng%22%0A%0A---%0Aname%3A+%22K%C3%A4pt%27n+Blaub%C3%A4r+-+Der+Film%22%0Amedium%3A+%22Movie%22%0Aauthor%3A+%22Walter+Moers%22%0Adirector%3A+%22Hayo+Freitag%22%0Alanguage%3A+%22ger%22)
```perl
if all_equal("medium","Book")
@@ -146,7 +146,7 @@ end
### if...elsif(...else)
-[You can also use additional `elsif`-blocks in as part of an `if`-conditional if you want to process data if the previous contitional is `falsy` but add a condition when the defined transformations should be processed:](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+as-records%0A%7C+decode-yaml%0A%7C+fix%28transformationFile%29%0A%7C+encode-yaml%0A%7C+print%0A%3B&transformation=if+all_equal%28%22medium%22%2C%22Book%22%29%0A++++add_field%28%22type%22%2C%22BibliographicResource%22%29%0Aelsif+all_contain%28%22medium%22%2C%22Audio%22%29%0A++++add_field%28%22type%22%2C%22AudioResource%22%29%0Aelsif+all_match%28%22medium%22%2C%22.%2AMovie.%2A%22%29%0A++++add_field%28%22type%22%2C%22AudioVisualResource%22%29%0Aelse%0A++++add_field%28%22type%22%2C%22Other%22%29%0Aend&data=---%0Aname%3A+%22The+13+1/2+lives+of+Captain+Bluebear%22%0Amedium%3A+%22Book%22%0Aauthor%3A+%22Walter+Moers%22%0Alanguage%3A+%22eng%22%0A%0A---%0Aname%3A+%22The+13+1/2+lives+of+Captain+Bluebear%22%0Amedium%3A+%22eBook%22%0Aauthor%3A+%22Walter+Moers%22%0Alanguage%3A+%22eng%22%0A%0A---%0Aname%3A+%22Die+13%C2%BD+Leben+des+K%C3%A4pt%E2%80%99n+Blaub%C3%A4r%22%0Amedium%3A+%22Book%22%0Aauthor%3A+%22Walter+Moers%22%0Alanguage%3A+%22ger%22%0A%0A---%0Aname%3A+%22The+13+1/2+lives+of+Captain+Bluebear%22%0Amedium%3A+%22Audio+Book%22%0Aauthor%3A+%22Walter+Moers%22%0Anarrator%3A+%22Bronson+Pinchot%22%0Alanguage%3A+%22eng%22%0A%0A---%0Aname%3A+%22K%C3%A4pt%27n+Blaub%C3%A4r+-+Der+Film%22%0Amedium%3A+%22Movie%22%0Aauthor%3A+%22Walter+Moers%22%0Adirector%3A+%22Hayo+Freitag%22%0Alanguage%3A+%22ger%22)
+[You can also use additional `elsif`-blocks as part of an `if`-conditional if you want to process data if the previous conditional is not met (`false`) but add a condition when the defined transformations should be processed:](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+as-records%0A%7C+decode-yaml%0A%7C+fix%28transformationFile%29%0A%7C+encode-yaml%0A%7C+print%0A%3B&transformation=if+all_equal%28%22medium%22%2C%22Book%22%29%0A++++add_field%28%22type%22%2C%22BibliographicResource%22%29%0Aelsif+all_contain%28%22medium%22%2C%22Audio%22%29%0A++++add_field%28%22type%22%2C%22AudioResource%22%29%0Aelsif+all_match%28%22medium%22%2C%22.%2AMovie.%2A%22%29%0A++++add_field%28%22type%22%2C%22AudioVisualResource%22%29%0Aelse%0A++++add_field%28%22type%22%2C%22Other%22%29%0Aend&data=---%0Aname%3A+%22The+13+1/2+lives+of+Captain+Bluebear%22%0Amedium%3A+%22Book%22%0Aauthor%3A+%22Walter+Moers%22%0Alanguage%3A+%22eng%22%0A%0A---%0Aname%3A+%22The+13+1/2+lives+of+Captain+Bluebear%22%0Amedium%3A+%22eBook%22%0Aauthor%3A+%22Walter+Moers%22%0Alanguage%3A+%22eng%22%0A%0A---%0Aname%3A+%22Die+13%C2%BD+Leben+des+K%C3%A4pt%E2%80%99n+Blaub%C3%A4r%22%0Amedium%3A+%22Book%22%0Aauthor%3A+%22Walter+Moers%22%0Alanguage%3A+%22ger%22%0A%0A---%0Aname%3A+%22The+13+1/2+lives+of+Captain+Bluebear%22%0Amedium%3A+%22Audio+Book%22%0Aauthor%3A+%22Walter+Moers%22%0Anarrator%3A+%22Bronson+Pinchot%22%0Alanguage%3A+%22eng%22%0A%0A---%0Aname%3A+%22K%C3%A4pt%27n+Blaub%C3%A4r+-+Der+Film%22%0Amedium%3A+%22Movie%22%0Aauthor%3A+%22Walter+Moers%22%0Adirector%3A+%22Hayo+Freitag%22%0Alanguage%3A+%22ger%22)
```perl
if all_equal("medium","Book")
@@ -176,16 +176,16 @@ end
Selectors work in combination with conditionals to define which records you want to filter out.
-For the supported selectors see: https://metafacture.github.io/metafacture-documentation/docs/fix/Fix-functions.html#selectors
+See the [list of supported selectors](https://metafacture.github.io/metafacture-documentation/docs/fix/Fix-functions.html#selectors).
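+
+For example, a sketch (assuming the `reject` selector, analogous to Catmandu's) that drops every movie record from the stream:
+
+```perl
+if any_equal("medium","Movie")
+  reject()
+end
+```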
## Binds
-As mentioned above [Binds](https://metafacture.github.io/metafacture-documentation/docs/fix/Fix-functions.html#binds) are wrappers for one or more fixes. They give extra control functionality for fixes such as loops. All binds have the same syntax:
+As mentioned above [Binds](https://metafacture.github.io/metafacture-documentation/docs/fix/Fix-functions.html#binds) are wrappers for one or more Fixes. They give extra control functionality for Fixes such as loops. All binds have the same syntax:
```perl
-do Bind(params,…)
- fix(..)
- fix(..)
+do Bind(params,...)
+ Fix(...)
+ Fix(...)
end
```
@@ -201,7 +201,7 @@ colours:
- green
```
-and you use the fix
+and you use the Fix
```perl
upcase("colours[].*")
@@ -222,7 +222,7 @@ result: "YELLOW is a nice color"
result: "GREEN is a nice color"
```
-If you want to only change it, under a certain condition:
+If you want to only change it under a certain condition:
```perl
if any_equal("colours[]","green")
@@ -232,7 +232,7 @@ if any_equal("colours[]","green")
end
```
-[This still transforms the all elements of an array because the conditional tests all elements not each individually.](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+as-records%0A%7C+decode-yaml%0A%7C+fix%28transformationFile%29%0A%7C+encode-yaml%0A%7C+print%0A%3B&transformation=if+any_equal%28%22colours%5B%5D%22%2C%22green%22%29%0A++upcase%28%22colours%5B%5D.%2A%22%29%0A++append%28%22colours%5B%5D.%2A%22%2C%22+is+a+nice+color%22%29%0A++copy_field%28%22colours%5B%5D.%2A%22%2C%22result.%24append%22%29%0Aend&data=---%0Acolours%3A%0A+-+red%0A+-+yellow%0A+-+green)
+[This still transforms all elements of an array because the conditional tests all elements, not each individually.](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+as-records%0A%7C+decode-yaml%0A%7C+fix%28transformationFile%29%0A%7C+encode-yaml%0A%7C+print%0A%3B&transformation=if+any_equal%28%22colours%5B%5D%22%2C%22green%22%29%0A++upcase%28%22colours%5B%5D.%2A%22%29%0A++append%28%22colours%5B%5D.%2A%22%2C%22+is+a+nice+color%22%29%0A++copy_field%28%22colours%5B%5D.%2A%22%2C%22result.%24append%22%29%0Aend&data=---%0Acolours%3A%0A+-+red%0A+-+yellow%0A+-+green)
To transform and copy only the value `green` you have to use the `do list`-Bind:
```perl
@@ -245,9 +245,9 @@ do list(path:"colours[]","var":"$i")
end
```
-[See this example here in the playground.](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+as-records%0A%7C+decode-yaml%0A%7C+fix%28transformationFile%29%0A%7C+encode-yaml%0A%7C+print%0A%3B&transformation=do+list%28path%3A%22colours%5B%5D%22%2C%22var%22%3A%22%24i%22%29%0A++++if+any_equal%28%22%24i%22%2C%22green%22%29%0A++++++++upcase%28%22%24i%22%29%0A++++++++append%28%22%24i%22%2C%22+is+a+nice+color%22%29%0A++++++++copy_field%28%22%24i%22%2C%22result%5B%5D.%24append%22%29%0A++++end%0Aend&data=---%0Acolours%3A%0A+-+red%0A+-+yellow%0A+-+green)
+[See this example in the playground.](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+as-records%0A%7C+decode-yaml%0A%7C+fix%28transformationFile%29%0A%7C+encode-yaml%0A%7C+print%0A%3B&transformation=do+list%28path%3A%22colours%5B%5D%22%2C%22var%22%3A%22%24i%22%29%0A++++if+any_equal%28%22%24i%22%2C%22green%22%29%0A++++++++upcase%28%22%24i%22%29%0A++++++++append%28%22%24i%22%2C%22+is+a+nice+color%22%29%0A++++++++copy_field%28%22%24i%22%2C%22result%5B%5D.%24append%22%29%0A++++end%0Aend&data=---%0Acolours%3A%0A+-+red%0A+-+yellow%0A+-+green)
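+
+With the input above, the result should look roughly like this (a sketch; the exact YAML formatting may differ):
+
+```yaml
+---
+colours:
+- red
+- yellow
+- GREEN is a nice color
+result:
+- GREEN is a nice color
+```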
-For the supported binds see: https://metafacture.github.io/metafacture-documentation/docs/fix/Fix-functions.html#binds
+See the [list of supported binds](https://metafacture.github.io/metafacture-documentation/docs/fix/Fix-functions.html#binds).
TODO: Add exercises.
diff --git a/docs/06_MetafactureCLI.md b/docs/06_MetafactureCLI.md
index 5c6b5ec..fc9ed0a 100644
--- a/docs/06_MetafactureCLI.md
+++ b/docs/06_MetafactureCLI.md
@@ -9,30 +9,40 @@ parent: Tutorial
## Get Metafacture Runner as CLI Tool
-Hint: This lesson requires basic practical knowledge of the command line and Shell.
-If you want to get familiar with tit, have a look at the great intro to Unix Shell by Library Carpentry: https://librarycarpentry.github.io/lc-shell/ (Session 1 - 3) You could also have a look at the great introdution by the Programming Historian to Powershell: https://programminghistorian.org/en/lessons/intro-to-powershell
+This lesson requires basic practical knowledge of the command line and the shell.
+If you want to get familiar with it, have a look at the great [intro to the Unix Shell by Library Carpentry](https://librarycarpentry.github.io/lc-shell/) (Session 1 - 3). You could also have a look at the great [introduction to PowerShell by the Programming Historian](https://programminghistorian.org/en/lessons/intro-to-powershell).
-While we had fun with our Metafacture Playground another way to use Metafacture is
-the command line. For running a Metafacture flux process we need a terminal and installed JAVA 11 ore higher.
-For creating and editing Flux and Fix files we need an texteditor like Codium/VS Code or others.
+While we had fun with our Metafacture Playground, another way to use Metafacture is via
+the command line. For running a Metafacture Flux process we need a terminal and Java 11 or higher.
+For creating and editing Flux and Fix files we need a text editor like Codium/VS Code or others.
For this lesson basic knowledge of the command line is recommended.
Check if Java 11 or higher is installed with `java -version` in your terminal.
-If not, install JAVA 11 or higher.
+If not, install Java 11 or higher.
-To use Metafacture on the commandline we can download the latest distribution e.g.: `metafacture-core-7.0.0-dist.zip`:
-
-[https://github.com/metafacture/metafacture-core/releases](https://github.com/metafacture/metafacture-core/releases)
-
-Hint: If 7.0.0 is not published yet use the runner version of the [prerelease 7.0.0-rc1](https://github.com/metafacture/metafacture-core/releases/tag/metafacture-core-7.0.0-rc1).
+To use Metafacture on the command line [download the latest (pre-)release](https://github.com/metafacture/metafacture-core/releases).
Download `metafacture-core-$VERSION-dist.tar.gz` or the zip version and extract the archive to your chosen folder.
In the folder you will find `flux.bat` and `flux.sh`.
-The code below assumes you moved the resulting folder to your home directory and renamed it to `'metafacture'`
+The code below assumes you moved the resulting folder to your home directory and renamed it to `metafacture`.
+
+If you run
+
+Unix:
-$ ~/metafacture/flux.sh # or flux.bat on Windows
+```bash
+~/metafacture/flux.sh
+```
+
+or Windows:
+
+```batch
+~\metafacture\flux.bat
+```
+
+Metafacture will list all currently available Flux commands.
## How to run Metafacture via CLI
@@ -50,7 +60,7 @@ or Windows:
~\metafacture\flux.bat path\to\your.flux
```
-(Hint: You need to know the path to your file to run the function.)
+(Hint: You need to know the path to your Flux file to run it.)
To get quick started let's revisit a Flux we toyed around with in the playground.
The playground has a nice feature to export and import Metafacture Workflows.
@@ -77,12 +87,10 @@ Windows:
To simplify the code examples we will be using Unix paths for the terminal commands. Windows PowerShell will change these to Windows paths automatically.
-The result of running the Flux-Script via CLI should be the same as with the Playground.
-
-The Metafacture CLI Tool expects a flux file for every workflow.
-Our runned workflow only has the following flux and no additional files since it is fetching data from the web and it has no fix transformations.
+The result of running the Flux script via CLI should be the same as with the Playground.
-The downloaded file should have the following content, defining the playground specific variables and the flux workflow that you also saw in the playground. You can delete the playground specific variables since they are not needed here.
+The Metafacture CLI tool expects a Flux file for every workflow.
+Our workflow only has the following Flux and no additional files, since it fetches data from the web and has no Fix transformations. The file should have the following content, defining the playground-specific variables and the Flux workflow that you also saw in the playground. You can delete the playground-specific variables since they are not needed here, so you would end up with this:
```text
"https://openlibrary.org/books/OL2838758M.json"
@@ -96,11 +104,11 @@ The downloaded file should have the following content, defining the playground s
## Use local files for transformation
-If you want to load a local file instead of fetching data from the web we need to change the flux a little bit with an texteditor.
-Download the following file [11942150X.json](./sample-scripts/lesson_06/11942150X.json)
+If you want to load a local file instead of fetching data from the web you need to change the Flux a little bit with a text editor.
+Download the following file [11942150X.json](../sample-scripts/lesson_06/11942150X.json)
and adjust the path to your file.
-Adjust your `downloads/playground.flux` script, so that it does not load data from the web, but opens a local file with `open-file` and read it `as-recrods` since the json file is pretty printed:
+Adjust your `downloads/playground.flux` script, so that it does not load data from the web, but opens a local file with `open-file` and reads it using `as-records` since the JSON file is pretty printed (not one record per line):
```text
"path/to/your/file/11942150X.json" // Adjust your path!
@@ -116,7 +124,7 @@ Run it again as shown above.
It should output:
-```JSON
+```json
{
"professionOrOccupation" : [ {
"id" : "https://d-nb.info/gnd/4629643-8",
@@ -284,7 +292,7 @@ It should output:
}
```
-If we want to use fix we need to refrence the fix file that in the playground we only refrenced via `| fix`
+If we want to use Fix we need to reference the Fix file (in the playground we only referenced the variable `transformationFile` via `| fix`):
```text
"path/to/your/file/11942150X.json"
@@ -297,20 +305,19 @@ If we want to use fix we need to refrence the fix file that in the playground we
;
```
-Create a new file with the name `fixFile.fix`, files with fix scripts should have a `.fix` file suffix.
+Create a new file with the name `fixFile.fix`. Files with Fix scripts should have a `.fix` file suffix to easily discriminate them later.
-Add the follwoing line as content to this file:
+Add the following line as content to this file:
```perl
retain("preferredName","id","type[]")
-
```
-Save it in the same folder as the flux file. (Hint: It does not always have to be in the same folder.)
+Save it in the same folder as the Flux file. (Hint: It does not always have to be in the same folder.)
## Use variables
-Hint: You can use the varliable FLUX_DIR to shorten the file path if the file is in the same folder as the flux-file.
+Hint: You can use the variable `FLUX_DIR` to shorten the file path if the file is in the same folder as the Flux file:
```text
FLUX_DIR + "file.json"
@@ -323,7 +330,7 @@ FLUX_DIR + "file.json"
;
```
-If you are using variables, that are not defined in the flux, you can pass them on with the CLI:
+If you are using variables that are not defined in the flux, you can pass them on with the CLI:
e.g.
@@ -338,7 +345,7 @@ FILE
;
```
-You could use:
+Which you would use like this:
```bash
~/metafacture/flux.sh path/to/your.flux FILE="path/to/your/file.json"
@@ -348,10 +355,10 @@ You could use:
Exercise: Download the following folder (TODO) with three test examples and run them. Adjust them if needed:
- Run example script locally.
-- Adjust example script so that all json files but no other in the folder are read. Get inspired by https://github.com/metafacture/metafacture-core/blob/master/metafacture-runner/src/main/dist/examples/misc/reading-dirs/read-dirs.flux.
-- Change the FLUX script so that you write the output in the local file instead of stoudt.
-- Add a fix file and add the fix module in the flux. With `nothing()` as content.
-- Add some transformations to the fix e.g. add fields.
+- Adjust example script so that all json files but no other in the folder are read. Get inspired by the [reading directories example](https://github.com/metafacture/metafacture-core/blob/master/metafacture-runner/src/main/dist/examples/misc/reading-dirs/read-dirs.flux).
+- Change the Flux script so that you write the output to a local file instead of stdout.
+- Add a Fix file with `nothing()` as content and add the Fix module to the Flux.
+- Add some transformations to the Fix, e.g. add fields.
---------------
diff --git a/docs/07_Processing_MARC.md b/docs/07_Processing_MARC.md
index 86bfa75..1dc8439 100644
--- a/docs/07_Processing_MARC.md
+++ b/docs/07_Processing_MARC.md
@@ -8,17 +8,17 @@ parent: Tutorial
# Lesson 7: Processing MARC with Metafacture
-In the previous lessons we learned how we can use Metafacture to process structured data like JSON. Today we will use Metafacture to process MARC metadata records. In this process we will see that MARC can be processed using FIX paths.
+In the previous lessons we learned how we can use Metafacture to process structured data like JSON. In this lesson we will use Metafacture to process MARC metadata records. In this process we will see that MARC can be processed using Fix paths.
-[Transformation marc data with metafacture can be used for multiple things, e.g. you could transform marc binary files to marc xml.](https://metafacture.org/playground/?flux=%22https%3A//raw.githubusercontent.com/metafacture/metafacture-tutorial/main/data/sample.mrc%22%0A%7C+open-http%0A%7C+as-lines%0A%7C+decode-marc21%28emitleaderaswhole%3D%22true%22%29%0A%7C+encode-marcxml%0A%7C+print%0A%3B)
+[Transformation of MARC data with Metafacture can be used for multiple things, e.g. you could transform MARC binary files to MARC XML.](https://metafacture.org/playground/?flux=%22https%3A//raw.githubusercontent.com/metafacture/metafacture-tutorial/main/data/sample.mrc%22%0A%7C+open-http%0A%7C+as-lines%0A%7C+decode-marc21%28emitleaderaswhole%3D%22true%22%29%0A%7C+encode-marcxml%0A%7C+print%0A%3B)
-As always, we will need to set up a small metafacture flux script.
+As always, we will need to set up a small Metafacture Flux script.
-Lets inspect a marc file: https://raw.githubusercontent.com/metafacture/metafacture-tutorial/main/data/sample.marc
+Let's inspect a MARC file: https://raw.githubusercontent.com/metafacture/metafacture-tutorial/main/data/sample.mrc
-Create the following flux in a new file e.g. name it `marc1.flux`:
+Create the following Flux in a new file and name it e.g. `marc1.flux`:
-```
+```text
"https://raw.githubusercontent.com/metafacture/metafacture-tutorial/main/data/sample.mrc"
| open-http
| as-lines
@@ -26,21 +26,25 @@ Create the following flux in a new file e.g. name it `marc1.flux`:
;
```
-Run this Flux via CLI (e.g. '/path/to/your/metafix-runner' 'path/to/your/marc1.flux'`)
+Run this Flux via CLI, e.g.:
+
+```bash
+/path/to/your/metafix-runner path/to/your/marc1.flux
+```
[Or use playground.](https://metafacture.org/playground/?flux=%22https%3A//raw.githubusercontent.com/metafacture/metafacture-tutorial/main/data/sample.mrc%22%0A%7C+open-http%0A%7C+as-lines%0A%7C+print%0A%3B)
You should see something like this:
-
+
You can also try to run the examples via CLI.
-## Get to know your marc data
+## Get to know your MARC data
-Like JSON the MARC file contains structured data but the format is different. All the data is on one line, but there isn’t at first sight a clear separation between fields and values. The field/value structure there but you need to use a MARC parser to extract this information. Metafacture contains a MARC parser which can be used to interpret this file.
+Like JSON the MARC file contains structured data but the format is different. All the data is on one line, but there isn’t at first sight a clear separation between fields and values. The field/value structure is there but you need to use a MARC parser to extract this information. Metafacture has a MARC parser which can be used to interpret this file.
-Lets create a new small Flux script to transform the Marc data into YAML, name it `marc2.flux`:
+Let's create a new small Flux script to transform the MARC data into YAML, name it `marc2.flux`:
```text
"https://raw.githubusercontent.com/metafacture/metafacture-core/master/metafacture-runner/src/main/dist/examples/read/marc21/10.marc21"
@@ -52,18 +56,18 @@ Lets create a new small Flux script to transform the Marc data into YAML, name i
;
```
-Run this FLUX script with your MF Runner on the CLI.
+Run this Flux script with your MF Runner on the CLI.
[Or try it in the the playground.](https://metafacture.org/playground/?flux=%22https%3A//raw.githubusercontent.com/metafacture/metafacture-tutorial/main/data/sample.mrc%22%0A%7C+open-http%0A%7C+as-lines%0A%7C+decode-marc21%0A%7C+encode-yaml%0A%7C+print%0A%3B)
-Running it in the playground or with the commandline you will see something like this
+Running it in the playground or on the command line you will see something like this:
-
+
Metafacture has its own decoder for Marc21 data. The structure is translated as follows: The [leader](https://www.loc.gov/marc/bibliographic/bdleader.html) can either be translated into an entity or a single element. All [control fields `00X`](https://www.loc.gov/marc/bibliographic/bd00x.html) are translated into simple string fields with the name `00X`.
-All `XXX` fields starting with `100` are translated in top elements with name of the field+indice numbers e.g. element 245 1. Ind 1 and 2. Ind 2 => `24512` . Every subfield is translated in a subfield. Additionally keep in mind that repeated elements are transformed into lists.
+All `XXX` fields starting with `100` are translated into top-level elements named after the field plus the indicator values, e.g. field 245 with indicator 1 = `1` and indicator 2 = `2` => `24512`. Every subfield is translated into a subfield. Additionally keep in mind that repeated elements are transformed into lists.
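As a sketch of this naming scheme (illustrative values, not actual tool output), a `245` field with indicator values `1` and `0` and two subfields would come out of `decode-marc21` in YAML roughly like this:

```text
"24510":
  a: "Some title"
  c: "Some author"
```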
-Lets use `list-fix-paths(count="false")` to show the pathes that are used in the records. It helps to get a overview of the records:
+Let's use `list-fix-paths(count="false")` to show the paths that are used in the records. It helps to get an overview of the records:
```text
"https://raw.githubusercontent.com/metafacture/metafacture-core/master/metafacture-runner/src/main/dist/examples/read/marc21/10.marc21"
@@ -79,9 +83,9 @@ Lets run it.
[See in the playground.](https://metafacture.org/playground/?flux=%22https%3A//raw.githubusercontent.com/metafacture/metafacture-core/master/metafacture-runner/src/main/dist/examples/read/marc21/10.marc21%22%0A%7C+open-http%0A%7C+as-lines%0A%7C+decode-marc21%0A%7C+list-fix-paths%28count%3D%22false%22%29%0A%7C+print%0A%3B%0A)
-## Transform some marc data
+## Transform some MARC data
-We can use metafacture fix to read the _id fields of the MARC record with the retain fix we learned in the Day 6 post:
+We can use Metafacture Fix to read the `_id` fields of the MARC record with the `retain` Fix we learned in the Day 6 post:
Flux:
@@ -134,11 +138,11 @@ _id: "1049752414"
[See it in the playground.](https://metafacture.org/playground/?flux=%22https%3A//raw.githubusercontent.com/metafacture/metafacture-tutorial/main/data/sample.mrc%22%0A%7C+open-http%0A%7C+as-lines%0A%7C+decode-marc21%0A%7C+fix%28%22retain%28%27_id%27%29%22%29%0A%7C+encode-yaml%0A%7C+print%0A%3B%0A)
-What is happening here? The MARC file `sample.mrc` contains more than one MARC record. For every MARC record Metafacture extracts here the `_id` field. This field is a hidden element in every record and for MARC Records it uses the value of the `001` element.
+What is happening here? The MARC file `sample.mrc` contains more than one MARC record. For every MARC record Metafacture extracts here the `_id` field. This field is a hidden element in every record and for MARC records it uses the value of the `001` element.
-Extracting data out of the MARC record itself is a bit more difficult. This is a little different than in Catmandu. As I said Metafacture has a specific marc21 decoder. Fields with their indices are translated into fields and every subfield becomes a subfield. What makes it difficult is that some fields are repeatable and some are not. (Catmandu translates the record into an array of arrays MF does not.)
+Extracting data out of the MARC record itself is a bit more difficult. This is a little bit different than in Catmandu. As mentioned, Metafacture has a specific Marc21 decoder. Fields with their indicators are translated into fields and every subfield becomes a subfield. What makes it difficult is that some fields are repeatable and some are not. (Catmandu translates the record into an array of arrays while Metafacture does not.)
-You need paths of the elements to extract the data. For instance the MARC leader is usually in the first field of a MARC record. In the previous posts about paths. To keep the `leader`element we need to retain the element `leader`.
+You need the paths of the elements to extract the data. For instance the MARC leader is usually in the first field of a MARC record. Have a look at the previous posts about paths. To keep the `leader` element we need to retain the element `leader`:
```text
"https://raw.githubusercontent.com/metafacture/metafacture-tutorial/main/data/sample.mrc"
@@ -268,9 +272,9 @@ leader:
The leader value is translated into a leader element with the subfields. You also can emit the leader as a whole string if you use `decode-marc21` with a specific option: `| decode-marc21(emitLeaderAsWhole="true")`. [See it here.](https://metafacture.org/playground/?flux=%22https%3A//raw.githubusercontent.com/metafacture/metafacture-tutorial/main/data/sample.mrc%22%0A%7C+open-http%0A%7C+as-lines%0A%7C+decode-marc21%28emitLeaderAsWhole%3D%22true%22%29%60%0A%7C+fix%28%22retain%28%27leader%27%29%22%29%0A%7C+encode-yaml%0A%7C+print%0A%3B%0A)
-To work with MARC and transform it in Metafatcture is more generic than in CATMANDU since no marc specific maps are needed. But some difficulties come with repeatable fields. This is something you usually don’t know. And you have to inspect this first.
+Transforming MARC in Metafacture is more generic than in Catmandu since no MARC-specific maps are needed. But some difficulties come with repeatable fields. Whether a field is repeated is something you usually don't know in advance, so you have to inspect the data first.
-Here you see, a simple mapping from the element `245 any indicators $a` to a new field names `title`. To map any incicator we use the wildcard ? for each indicator so the path is: `245??.a`
+Here you see a simple mapping from the element `245 any indicators $a` to a new field named `title`. To map any indicator we use the wildcard `?` for each indicator, so the path is: `245??.a`
Flux:
@@ -293,9 +297,9 @@ retain("title")
[See here in the playground.](https://metafacture.org/playground/?flux=%22https%3A//raw.githubusercontent.com/metafacture/metafacture-tutorial/main/data/sample.mrc%22%0A%7C+open-http%0A%7C+as-lines%0A%7C+decode-marc21%0A%7C+fix%28transformationFile%29%0A%7C+encode-yaml%0A%7C+print%0A%3B%0A&transformation=copy_field%28%22245%3F%3F.a%22%2C+%22title%22%29%0Aretain%28%22title%22%29)
-More elaborate mappings can be done too. I’ll show you more complete examples in the next posts. As a warming up, here is some code to extract all the record identifiers, titles and isbn numbers in a MARC file into a CSV listing (which you can open in Excel).
+More elaborate mappings can be done, too. More complete examples follow in the next posts. As a warm-up, here is some code to extract all the record identifiers, titles and ISBN numbers of a MARC file into a CSV table (which you can open in Excel).
-Step 1, create a fix file `transformationFile.fix` containing:
+Step 1: create the Fix file `transformationFile.fix`, containing:
```perl
copy_field("001","id")
@@ -312,9 +316,9 @@ join_field(isbn,",")
retain("id","title","isbn")
```
-HINT: Sometimes it makes sense to create an empty array by `add_array` or an empty hash/object by `add_hash` before adding content to the array or hash. The is depending to the use-cases. In our case we need empty values if no field is mapped for the csv.
+HINT: Sometimes it makes sense to create an empty array with `add_array` or an empty hash/object with `add_hash` before adding content to the array or hash. This depends on the use case. In our case we need empty values for the CSV if no field is mapped.
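To illustrate the hint, here is a minimal Fix sketch (an assumption-laden variant of the step-1 Fix file above, not the tutorial's own code): because the array is created up front, records without any `020` field still end up with an empty `isbn` value in the CSV instead of a missing column.

```perl
# Sketch: create the array first so records without any 020 field
# still get an (empty) isbn value in the CSV
add_array("isbn[]")
copy_field("020??.a", "isbn[].$append")
join_field("isbn[]", ",")
```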
- Step 2, create the flux workflow and execute this worklow either with CLI or the playground:
+Step 2: create the Flux workflow and execute this workflow either with CLI or the playground:
```text
"https://raw.githubusercontent.com/metafacture/metafacture-core/master/metafacture-runner/src/main/dist/examples/read/marc21/10.marc21"
@@ -347,23 +351,23 @@ You will see this as output:
"1080278184","Renfro Valley Kentucky Rainer H. Schmeissner",""
```
-In the fix above we mapped the 245-field to the title, and iterated over every subfield with the help of the list-bind and the `?`- wildcard. The ISBN is in the 020-field. Because MARC records can contain one or more 020 fields we created an isbn array with add_array and added the values using the isbn.$append syntax. Next we turned the isbn array back into a comma separated string using the join_field fix. As last step we deleted all the fields we didn’t need in the output with the `retain` syntax.
+In the Fix above we mapped field 245 to the title, and iterated over every subfield with the help of the list bind and the `?` wildcard. The ISBN is in field 020. Because MARC records can contain one or more 020 fields we created an `isbn` array with `add_array` and added the values using the `isbn.$append` syntax. Next we turned the `isbn` array back into a comma-separated string using the `join_field` Fix. As a last step we deleted all the fields we didn't need in the output with `retain`.
-Different versions of MARC-Serialization need different workflows: e.g. h[ere see an example of Aseq-Marc Files that are transformed to marcxml.](https://test.metafacture.org/playground/?flux=%22https%3A//raw.githubusercontent.com/LibreCat/Catmandu-MARC/dev/t/rug01.aleph%22%0A%7C+open-http%0A%7C+as-lines%0A%7C+decode-aseq%0A%7C+merge-same-ids%0A%7C+encode-marcxml%0A%7C+print%0A%3B)
+Different versions of MARC serialization need different workflows: e.g. [see here an example of Aseq MARC files that are transformed to MARC XML.](https://test.metafacture.org/playground/?flux=%22https%3A//raw.githubusercontent.com/LibreCat/Catmandu-MARC/dev/t/rug01.aleph%22%0A%7C+open-http%0A%7C+as-lines%0A%7C+decode-aseq%0A%7C+merge-same-ids%0A%7C+encode-marcxml%0A%7C+print%0A%3B)
-In this post we demonstrated how to process MARC data. In the next post we will show some examples how catmandu typically can be used to process library data.
+In this post we demonstrated how to process MARC data. In the next post we will show some examples of how Metafacture typically can be used to process library data.
-## Excercise.
+## Exercise
-Try to fetch some data from GND or other MARC XML resource and use the flux command `list-fix-paths` on them, e.g.: https://d-nb.info/1351874063/about/marcxml
+Try to fetch some data from GND or another MARC XML resource and use the Flux command `list-fix-paths` on it, e.g. https://d-nb.info/1351874063/about/marcxml
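One possible starting point for this exercise is the following Flux sketch. It only combines modules shown earlier in this lesson (`open-http`, `decode-xml`, `handle-marcxml`, `list-fix-paths`); the URL is the GND example record mentioned above:

```text
"https://d-nb.info/1351874063/about/marcxml"
| open-http
| decode-xml
| handle-marcxml
| list-fix-paths(count="false")
| print
;
```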
-[In context of changes in cataloguing rules to RDA the element 260 is no used anymore, `move_field` the content of 260 to 264.](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+decode-xml%0A%7C+handle-marcxml%0A%7C+fix%28transformationFile%29%0A%7C+encode-marcxml%0A%7C+print%0A%3B&transformation=&data=%3C%3Fxml+version%3D%221.0%22+encoding%3D%22UTF-8%22%3F%3E%0A++%3Crecord+xmlns%3D%22http%3A//www.loc.gov/MARC21/slim%22+type%3D%22Bibliographic%22%3E%0A++++%3Cleader%3E00000nam+a2200000uc+4500%3C/leader%3E%0A++++%3Ccontrolfield+tag%3D%22001%22%3E1191316114%3C/controlfield%3E%0A++++%3Ccontrolfield+tag%3D%22003%22%3EDE-101%3C/controlfield%3E%0A++++%3Ccontrolfield+tag%3D%22005%22%3E20210617171509.0%3C/controlfield%3E%0A++++%3Ccontrolfield+tag%3D%22007%22%3Ecr%7C%7C%7C%7C%7C%7C%7C%7C%7C%7C%7C%7C%3C/controlfield%3E%0A++++%3Ccontrolfield+tag%3D%22008%22%3E190724s2010++++gw+%7C%7C%7C%7C%7Co%7C%7C%7C%7C+00%7C%7C%7C%7Ceng++%3C/controlfield%3E%0A++++%3Cdatafield+tag%3D%22041%22+ind1%3D%22+%22+ind2%3D%22+%22%3E%0A++++++%3Csubfield+code%3D%22a%22%3Eeng%3C/subfield%3E%0A++++%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22100%22+ind1%3D%221%22+ind2%3D%22+%22%3E%0A++++++%3Csubfield+code%3D%220%22%3Ehttps%3A//d-nb.info/gnd/142627097%3C/subfield%3E%0A++++++%3Csubfield+code%3D%22a%22%3EBorgman%2C+Christine+L.%3C/subfield%3E%0A++++++%3Csubfield+code%3D%224%22%3Eaut%3C/subfield%3E%0A++++++%3Csubfield+code%3D%222%22%3Egnd%3C/subfield%3E%0A++++%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22245%22+ind1%3D%221%22+ind2%3D%220%22%3E%0A++++++%3Csubfield+code%3D%22a%22%3EResearch+Data%3A+who+will+share+what%2C+with+whom%2C+when%2C+and+why%3F%3C/subfield%3E%0A++++%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22260%22+ind1%3D%22+%22+ind2%3D%22+%22%3E%0A++++++%3Csubfield+code%3D%22a%22%3EBerlin%3C/subfield%3E%0A++++++%3Csubfield+code%3D%22c%22%3E2010%3C/subfield%3E%0A++++%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22300%22+ind1%3D%22+%22+ind2%3D%22+%22%3E%0A++++++%3Csubfield+code%3D%
22a%22%3EOnline-Ressource%2C+21+S.%3C/subfield%3E%0A++++%3C/datafield%3E%0A++%3C/record%3E%0A)
+- [In context of changes in cataloguing rules to RDA the element 260 is not used anymore, `move_field` the content of 260 to 264.](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+decode-xml%0A%7C+handle-marcxml%0A%7C+fix%28transformationFile%29%0A%7C+encode-marcxml%0A%7C+print%0A%3B&transformation=&data=%3C%3Fxml+version%3D%221.0%22+encoding%3D%22UTF-8%22%3F%3E%0A++%3Crecord+xmlns%3D%22http%3A//www.loc.gov/MARC21/slim%22+type%3D%22Bibliographic%22%3E%0A++++%3Cleader%3E00000nam+a2200000uc+4500%3C/leader%3E%0A++++%3Ccontrolfield+tag%3D%22001%22%3E1191316114%3C/controlfield%3E%0A++++%3Ccontrolfield+tag%3D%22003%22%3EDE-101%3C/controlfield%3E%0A++++%3Ccontrolfield+tag%3D%22005%22%3E20210617171509.0%3C/controlfield%3E%0A++++%3Ccontrolfield+tag%3D%22007%22%3Ecr%7C%7C%7C%7C%7C%7C%7C%7C%7C%7C%7C%7C%3C/controlfield%3E%0A++++%3Ccontrolfield+tag%3D%22008%22%3E190724s2010++++gw+%7C%7C%7C%7C%7Co%7C%7C%7C%7C+00%7C%7C%7C%7Ceng++%3C/controlfield%3E%0A++++%3Cdatafield+tag%3D%22041%22+ind1%3D%22+%22+ind2%3D%22+%22%3E%0A++++++%3Csubfield+code%3D%22a%22%3Eeng%3C/subfield%3E%0A++++%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22100%22+ind1%3D%221%22+ind2%3D%22+%22%3E%0A++++++%3Csubfield+code%3D%220%22%3Ehttps%3A//d-nb.info/gnd/142627097%3C/subfield%3E%0A++++++%3Csubfield+code%3D%22a%22%3EBorgman%2C+Christine+L.%3C/subfield%3E%0A++++++%3Csubfield+code%3D%224%22%3Eaut%3C/subfield%3E%0A++++++%3Csubfield+code%3D%222%22%3Egnd%3C/subfield%3E%0A++++%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22245%22+ind1%3D%221%22+ind2%3D%220%22%3E%0A++++++%3Csubfield+code%3D%22a%22%3EResearch+Data%3A+who+will+share+what%2C+with+whom%2C+when%2C+and+why%3F%3C/subfield%3E%0A++++%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22260%22+ind1%3D%22+%22+ind2%3D%22+%22%3E%0A++++++%3Csubfield+code%3D%22a%22%3EBerlin%3C/subfield%3E%0A++++++%3Csubfield+code%3D%22c%22%3E2010%3C/subfield%3E%0A++++%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22300%22+ind1%3D%22+%22+ind2%3D%22+%22%3E%0A++++++%3Csubfield+code%
3D%22a%22%3EOnline-Ressource%2C+21+S.%3C/subfield%3E%0A++++%3C/datafield%3E%0A++%3C/record%3E%0A)
-- [In a publishers provided metadata `264 .c` has a prefix `c` for copyright, this is unnecessary. Delete the c e.g. by using `replace_all`.](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+decode-xml%0A%7C+handle-marcxml%0A%7C+fix%28transformationFile%29%0A%7C+encode-marcxml%0A%7C+print%0A%3B&transformation=%23+ersetze+im+Feld+264+c+das+%22c%22+durch+nichts+%22%22%0A%23+Hilfestellung+Pfad+zum+Unterfeld%3A++%22264++.c%22%0A&data=%3C%3Fxml+version%3D%221.0%22+encoding%3D%22UTF-8%22%3F%3E%0A++%3Crecord+xmlns%3D%22http%3A//www.loc.gov/MARC21/slim%22+type%3D%22Bibliographic%22%3E%0A++++%3Cleader%3E00000nam+a2200000uc+4500%3C/leader%3E%0A++++%3Ccontrolfield+tag%3D%22001%22%3E1191316114%3C/controlfield%3E%0A++++%3Ccontrolfield+tag%3D%22003%22%3EDE-101%3C/controlfield%3E%0A++++%3Ccontrolfield+tag%3D%22005%22%3E20210617171509.0%3C/controlfield%3E%0A++++%3Ccontrolfield+tag%3D%22007%22%3Ecr%7C%7C%7C%7C%7C%7C%7C%7C%7C%7C%7C%7C%3C/controlfield%3E%0A++++%3Ccontrolfield+tag%3D%22008%22%3E190724s2010++++gw+%7C%7C%7C%7C%7Co%7C%7C%7C%7C+00%7C%7C%7C%7Ceng++%3C/controlfield%3E%0A++++%3Cdatafield+tag%3D%22041%22+ind1%3D%22+%22+ind2%3D%22+%22%3E%0A++++++%3Csubfield+code%3D%22a%22%3Eeng%3C/subfield%3E%0A++++%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22100%22+ind1%3D%221%22+ind2%3D%22+%22%3E%0A++++++%3Csubfield+code%3D%220%22%3Ehttps%3A//d-nb.info/gnd/142627097%3C/subfield%3E%0A++++++%3Csubfield+code%3D%22a%22%3EBorgman%2C+Christine+L.%3C/subfield%3E%0A++++++%3Csubfield+code%3D%224%22%3Eaut%3C/subfield%3E%0A++++++%3Csubfield+code%3D%222%22%3Egnd%3C/subfield%3E%0A++++%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22245%22+ind1%3D%221%22+ind2%3D%220%22%3E%0A++++++%3Csubfield+code%3D%22a%22%3EResearch+Data%3A+who+will+share+what%2C+with+whom%2C+when%2C+and+why%3F%3C/subfield%3E%0A++++%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22264%22+ind1%3D%22+%22+ind2%3D%22+%22%3E%0A++++++%3Csubfield+code%3D%22a%22%3EBerlin%3C/subfield%3E%0A++++++%3Csubfield+code%3D%22c%22%3Ec2010%3C
/subfield%3E%0A++++%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22300%22+ind1%3D%22+%22+ind2%3D%22+%22%3E%0A++++++%3Csubfield+code%3D%22a%22%3EOnline-Ressource%2C+21+S.%3C/subfield%3E%0A++++%3C/datafield%3E%0A++%3C/record%3E%0A)
+- [In a publisher's provided metadata `264 .c` has a prefix `c` for copyright, this is unnecessary. Delete the c e.g. by using `replace_all`.](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+decode-xml%0A%7C+handle-marcxml%0A%7C+fix%28transformationFile%29%0A%7C+encode-marcxml%0A%7C+print%0A%3B&transformation=%23+ersetze+im+Feld+264+c+das+%22c%22+durch+nichts+%22%22%0A%23+Hilfestellung+Pfad+zum+Unterfeld%3A++%22264++.c%22%0A&data=%3C%3Fxml+version%3D%221.0%22+encoding%3D%22UTF-8%22%3F%3E%0A++%3Crecord+xmlns%3D%22http%3A//www.loc.gov/MARC21/slim%22+type%3D%22Bibliographic%22%3E%0A++++%3Cleader%3E00000nam+a2200000uc+4500%3C/leader%3E%0A++++%3Ccontrolfield+tag%3D%22001%22%3E1191316114%3C/controlfield%3E%0A++++%3Ccontrolfield+tag%3D%22003%22%3EDE-101%3C/controlfield%3E%0A++++%3Ccontrolfield+tag%3D%22005%22%3E20210617171509.0%3C/controlfield%3E%0A++++%3Ccontrolfield+tag%3D%22007%22%3Ecr%7C%7C%7C%7C%7C%7C%7C%7C%7C%7C%7C%7C%3C/controlfield%3E%0A++++%3Ccontrolfield+tag%3D%22008%22%3E190724s2010++++gw+%7C%7C%7C%7C%7Co%7C%7C%7C%7C+00%7C%7C%7C%7Ceng++%3C/controlfield%3E%0A++++%3Cdatafield+tag%3D%22041%22+ind1%3D%22+%22+ind2%3D%22+%22%3E%0A++++++%3Csubfield+code%3D%22a%22%3Eeng%3C/subfield%3E%0A++++%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22100%22+ind1%3D%221%22+ind2%3D%22+%22%3E%0A++++++%3Csubfield+code%3D%220%22%3Ehttps%3A//d-nb.info/gnd/142627097%3C/subfield%3E%0A++++++%3Csubfield+code%3D%22a%22%3EBorgman%2C+Christine+L.%3C/subfield%3E%0A++++++%3Csubfield+code%3D%224%22%3Eaut%3C/subfield%3E%0A++++++%3Csubfield+code%3D%222%22%3Egnd%3C/subfield%3E%0A++++%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22245%22+ind1%3D%221%22+ind2%3D%220%22%3E%0A++++++%3Csubfield+code%3D%22a%22%3EResearch+Data%3A+who+will+share+what%2C+with+whom%2C+when%2C+and+why%3F%3C/subfield%3E%0A++++%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22264%22+ind1%3D%22+%22+ind2%3D%22+%22%3E%0A++++++%3Csubfield+code%3D%22a%22%3EBerlin%3C/subfield%3E%0A++++++%3Csubfield+code%3D%22c%22%3Ec2010%3
C/subfield%3E%0A++++%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22300%22+ind1%3D%22+%22+ind2%3D%22+%22%3E%0A++++++%3Csubfield+code%3D%22a%22%3EOnline-Ressource%2C+21+S.%3C/subfield%3E%0A++++%3C/datafield%3E%0A++%3C/record%3E%0A)
-[- Fetch all ISBNs from field 020 and write them in an json array element callend `isbn\[\]` and only retain this element.](https://metafacture.org/playground/?flux=inputFile%0A%7Copen-file%0A%7Cdecode-xml%0A%7Chandle-marcxml%0A%7Cfix%28transformationFile%29%0A%7Cencode-json%28prettyPrinting%3D%22true%22%29%0A%7Cprint%0A%3B&transformation=&data=%3C%3Fxml+version%3D%221.0%22+encoding%3D%22UTF-8%22%3F%3E%0A++%3Crecord+xmlns%3D%22http%3A//www.loc.gov/MARC21/slim%22+type%3D%22Bibliographic%22%3E%0A++++%3Cleader%3E00000nam+a2200000+c+4500%3C/leader%3E%0A++++%3Ccontrolfield+tag%3D%22008%22%3E190712%7C2020%23%23%23%23xxu%23%23%23%23%23%23%23%23%23%23%23%7C%7C%7C%23%7C%23eng%23c%3C/controlfield%3E%0A++++%3Ccontrolfield+tag%3D%22001%22%3E990363239750206441%3C/controlfield%3E%0A++++%3Cdatafield+tag%3D%22020%22+ind1%3D%22+%22+ind2%3D%22+%22%3E%3Csubfield+code%3D%22a%22%3E9781138393295%3C/subfield%3E%3Csubfield+code%3D%22c%22%3Epaperback%3C/subfield%3E%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22020%22+ind1%3D%22+%22+ind2%3D%22+%22%3E%3Csubfield+code%3D%22a%22%3E9780367260934%3C/subfield%3E%3Csubfield+code%3D%22c%22%3Ehardback%3C/subfield%3E%3Csubfield+code%3D%229%22%3E978036726093-4%3C/subfield%3E%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22041%22+ind1%3D%22+%22+ind2%3D%22+%22%3E%3Csubfield+code%3D%22a%22%3Eeng%3C/subfield%3E%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22100%22+ind1%3D%221%22+ind2%3D%22+%22%3E%3Csubfield+code%3D%22a%22%3EMatloff%2C+Norman+S.%3C/subfield%3E%3Csubfield+code%3D%22d%22%3E1948-%3C/subfield%3E%3Csubfield+code%3D%220%22%3E%28DE-588%291018956115%3C/subfield%3E%3Csubfield+code%3D%224%22%3Eaut%3C/subfield%3E%3Csubfield+code%3D%220%22%3Ehttps%3A//d-nb.info/gnd/1018956115%3C/subfield%3E%3Csubfield+code%3D%220%22%3Ehttp%3A//viaf.org/viaf/65542823%3C/subfield%3E%3Csubfield+code%3D%22B%22%3EGND-1018956115%3C/subfield%3E%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22245%22+ind1%3D%221%22+ind2%3D%220%22%3E%3Csubfield+code%3D%22a%22%3EProbability+and+statistics+fo
r+data+science%3C/subfield%3E%3Csubfield+code%3D%22b%22%3EMath+%2B+R+%2B+Data%3C/subfield%3E%3Csubfield+code%3D%22c%22%3ENorman+Matloff%3C/subfield%3E%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22264%22+ind1%3D%22+%22+ind2%3D%221%22%3E%3Csubfield+code%3D%22a%22%3EBoca+Raton+%3B+London+%3B+New+York%3C/subfield%3E%3Csubfield+code%3D%22b%22%3ECRC+Press%3C/subfield%3E%3Csubfield+code%3D%22c%22%3E%5B2020%5D%3C/subfield%3E%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22300%22+ind1%3D%22+%22+ind2%3D%22+%22%3E%3Csubfield+code%3D%22a%22%3Exxxii%2C+412+Seiten%3C/subfield%3E%3Csubfield+code%3D%22b%22%3EDiagramme%3C/subfield%3E%3Csubfield+code%3D%22c%22%3E24+cm%3C/subfield%3E%3C/datafield%3E%0A++%3C/record%3E)
+- [Fetch all ISBNs from field 020 and write them in an JSON array element callend `isbn\[\]` and only retain this element.](https://metafacture.org/playground/?flux=inputFile%0A%7Copen-file%0A%7Cdecode-xml%0A%7Chandle-marcxml%0A%7Cfix%28transformationFile%29%0A%7Cencode-json%28prettyPrinting%3D%22true%22%29%0A%7Cprint%0A%3B&transformation=&data=%3C%3Fxml+version%3D%221.0%22+encoding%3D%22UTF-8%22%3F%3E%0A++%3Crecord+xmlns%3D%22http%3A//www.loc.gov/MARC21/slim%22+type%3D%22Bibliographic%22%3E%0A++++%3Cleader%3E00000nam+a2200000+c+4500%3C/leader%3E%0A++++%3Ccontrolfield+tag%3D%22008%22%3E190712%7C2020%23%23%23%23xxu%23%23%23%23%23%23%23%23%23%23%23%7C%7C%7C%23%7C%23eng%23c%3C/controlfield%3E%0A++++%3Ccontrolfield+tag%3D%22001%22%3E990363239750206441%3C/controlfield%3E%0A++++%3Cdatafield+tag%3D%22020%22+ind1%3D%22+%22+ind2%3D%22+%22%3E%3Csubfield+code%3D%22a%22%3E9781138393295%3C/subfield%3E%3Csubfield+code%3D%22c%22%3Epaperback%3C/subfield%3E%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22020%22+ind1%3D%22+%22+ind2%3D%22+%22%3E%3Csubfield+code%3D%22a%22%3E9780367260934%3C/subfield%3E%3Csubfield+code%3D%22c%22%3Ehardback%3C/subfield%3E%3Csubfield+code%3D%229%22%3E978036726093-4%3C/subfield%3E%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22041%22+ind1%3D%22+%22+ind2%3D%22+%22%3E%3Csubfield+code%3D%22a%22%3Eeng%3C/subfield%3E%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22100%22+ind1%3D%221%22+ind2%3D%22+%22%3E%3Csubfield+code%3D%22a%22%3EMatloff%2C+Norman+S.%3C/subfield%3E%3Csubfield+code%3D%22d%22%3E1948-%3C/subfield%3E%3Csubfield+code%3D%220%22%3E%28DE-588%291018956115%3C/subfield%3E%3Csubfield+code%3D%224%22%3Eaut%3C/subfield%3E%3Csubfield+code%3D%220%22%3Ehttps%3A//d-nb.info/gnd/1018956115%3C/subfield%3E%3Csubfield+code%3D%220%22%3Ehttp%3A//viaf.org/viaf/65542823%3C/subfield%3E%3Csubfield+code%3D%22B%22%3EGND-1018956115%3C/subfield%3E%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22245%22+ind1%3D%221%22+ind2%3D%220%22%3E%3Csubfield+code%3D%22a%22%3EProbability+and+statistics+fo
r+data+science%3C/subfield%3E%3Csubfield+code%3D%22b%22%3EMath+%2B+R+%2B+Data%3C/subfield%3E%3Csubfield+code%3D%22c%22%3ENorman+Matloff%3C/subfield%3E%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22264%22+ind1%3D%22+%22+ind2%3D%221%22%3E%3Csubfield+code%3D%22a%22%3EBoca+Raton+%3B+London+%3B+New+York%3C/subfield%3E%3Csubfield+code%3D%22b%22%3ECRC+Press%3C/subfield%3E%3Csubfield+code%3D%22c%22%3E%5B2020%5D%3C/subfield%3E%3C/datafield%3E%0A++++%3Cdatafield+tag%3D%22300%22+ind1%3D%22+%22+ind2%3D%22+%22%3E%3Csubfield+code%3D%22a%22%3Exxxii%2C+412+Seiten%3C/subfield%3E%3Csubfield+code%3D%22b%22%3EDiagramme%3C/subfield%3E%3Csubfield+code%3D%22c%22%3E24+cm%3C/subfield%3E%3C/datafield%3E%0A++%3C/record%3E)
-- [Create a list with all identifiers in `001` without elementnames or occurence for, you could use the flux command `list-fix-values` with the option `count="false"`.](https://metafacture.org/playground/?flux=%22https%3A//raw.githubusercontent.com/metafacture/metafacture-core/master/metafacture-runner/src/main/dist/examples/read/marc21/10.marc21%22%0A%7C+open-http%0A%7C+as-lines%0A%7C+decode-marc21%0A%7C+...%0A%7C+print%0A%3B)
+- [Create a list with all identifiers in `001` without element names or occurrence. You could use the Flux command `list-fix-values` with the option `count="false"`.](https://metafacture.org/playground/?flux=%22https%3A//raw.githubusercontent.com/metafacture/metafacture-core/master/metafacture-runner/src/main/dist/examples/read/marc21/10.marc21%22%0A%7C+open-http%0A%7C+as-lines%0A%7C+decode-marc21%0A%7C+...%0A%7C+print%0A%3B)
- TODO: Add an example for a list bind or a conditional.
diff --git a/docs/08_Harvest_data_with_OAI-PMH.md b/docs/08_Harvest_data_with_OAI-PMH.md
index 1068378..a41c2a7 100644
--- a/docs/08_Harvest_data_with_OAI-PMH.md
+++ b/docs/08_Harvest_data_with_OAI-PMH.md
@@ -10,22 +10,20 @@ parent: Tutorial
The Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH) is a protocol to harvest metadata records from OAI compliant repositories. It was developed by the Open Archives Initiative as a low-barrier mechanism for repository interoperability. The Open Archives Initiative maintains a registry of OAI data providers.
-Metafacture provides an opener flux module for harvesting metadata from OAI-PMH: `open-oaipmh`
+Metafacture provides a Flux module for harvesting metadata from OAI-PMH: `open-oaipmh`.
Let's have a look at the documentation of `open-oaipmh`:

-There you see the specific options that can be used to configure your OAI PMH Harvesting.
+There you see the specific options that can be used to configure your OAI-PMH harvesting.
-Every OAI server must provide metadata records in Dublin Core, other (bibliographic) formats like MARC may be supported additionally. Available metadata formats can be detected with the OAI verb `ListMetadataFormats`: https://lib.ugent.be/oai?verb=ListMetadataFormats
-
-This OAI-PMH API provides MODS and Dublin Core. For specifying the metadataformat you use the `metadataprefix:` Option.
+Every OAI server must provide metadata records in Dublin Core; other (bibliographic) formats like MARC may be supported additionally. Available metadata formats can be detected with the OAI verb `ListMetadataFormats`, [see an example](https://lib.ugent.be/oai?verb=ListMetadataFormats) that provides MODS and Dublin Core. For specifying the metadata format use the `metadataprefix` option.
The OAI server may support selective harvesting, so OAI clients can get only subsets of records from a repository.
-The client requests could be limited via datestamps (`datefrom`, `dateuntil`) or set membership (`setSpec`).
+The client requests can be limited via datestamps (`datefrom`, `dateuntil`) or via set membership (`setSpec`).
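For illustration, a selective harvest limited by datestamps could be sketched like this (the dates are placeholders; a set restriction would be added analogously with the `setspec` option):

```text
"https://lib.ugent.be/oai"
| open-oaipmh(metadataprefix="oai_dc", datefrom="2020-01-01", dateuntil="2020-12-31")
| decode-xml
| handle-generic-xml
| encode-json(prettyPrinting="true")
| print
;
```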
-To get some Dublin Core records from the collection of Ghent University Library and convert it to JSON (default) run the following Metafacture worklow via Playground or CLI:
+To get some Dublin Core records from the collection of Ghent University Library and convert them to JSON (the default), run the following Metafacture workflow via the Playground or CLI:
```text
"https://lib.ugent.be/oai"
@@ -37,9 +35,9 @@ To get some Dublin Core records from the collection of Ghent University Library
;
```
-But if you just want to use the specific metadata records and not the oai-pmh specific metadata wrappers then specify the xml handler like this: `| handle-generic-xml(recordtagname="dc")`
+If you just want the actual metadata records and not the OAI-PMH-specific metadata wrappers, specify the XML handler like this: `| handle-generic-xml(recordtagname="dc")`
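Adapted from the workflow above, this could look like the following sketch (`recordtagname="dc"` assumes the Dublin Core wrapper element is named `dc` at this endpoint):

```text
"https://lib.ugent.be/oai"
| open-oaipmh(metadataprefix="oai_dc")
| decode-xml
| handle-generic-xml(recordtagname="dc")
| encode-json(prettyPrinting="true")
| print
;
```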
-You can also harvest MARC data, serialze it to marc-binary and store it in a file:
+You can also harvest MARC data, serialize it to MARC-binary and store it in a file:
```text
"https://lib.ugent.be/oai"
@@ -51,7 +49,7 @@ You can also harvest MARC data, serialze it to marc-binary and store it in a fil
;
```
-You can also transform incoming data and immediately store/index it with MongoDB or Elasticsearch. For the transformation you need to create a fix (see Lesson 3) in the playground or in a text editor:
+You can also transform incoming data and prepare it for indexing with Elasticsearch. For the transformation you need to create a Fix (see Lesson 3) in the Playground or in a text editor:
Add the following fixes to the file:
@@ -63,7 +61,7 @@ copy_field("260??.c","date")
retain("_id","title","creator[]","date")
```
-Now you can run an ETL process (extract, transform, load) with this worklflow:
+Now you can run an ETL process (extract, transform, load) with this workflow; we use `json-to-elasticsearch-bulk` to prepare the output for Elasticsearch indexing:
```text
"https://lib.ugent.be/oai"
@@ -77,7 +75,7 @@ Now you can run an ETL process (extract, transform, load) with this worklflow:
;
```
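Putting the pieces together, the full ETL workflow could look roughly like this (a sketch: the Fix from above is assumed to be stored in `dublin.fix`, and the `json-to-elasticsearch-bulk` arguments (id key, type, index name) are assumptions to adapt to your setup):

```text
"https://lib.ugent.be/oai"
| open-oaipmh(metadataprefix="oai_dc")
| decode-xml
| handle-generic-xml(recordtagname="dc")
| fix("dublin.fix")
| encode-json
| json-to-elasticsearch-bulk("_id", "record", "oai-index")
| print
;
```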
-Excercise: Try to fetch data from a OAI-PMH you know. (e.g. the [DNB OAI](https://www.dnb.de/DE/Professionell/Metadatendienste/Datenbezug/OAI/oai_node.html))
+Exercise: Try to fetch data from an OAI-PMH endpoint you know (e.g. the [DNB OAI](https://www.dnb.de/DE/Professionell/Metadatendienste/Datenbezug/OAI/oai_node.html)).
---------------
diff --git a/docs/09_Working_with_CSV.md b/docs/09_Working_with_CSV.md
index 1c7767d..9d7b7d8 100644
--- a/docs/09_Working_with_CSV.md
+++ b/docs/09_Working_with_CSV.md
@@ -8,13 +8,13 @@ parent: Tutorial
# Lesson 9: Working with CSV and TSV files
-CSV and TSV files are widely-used to store and exchange simple structured data. Many open datasets are published as CSV or TSV files, e.g. datahub.io. Within the library community CSV files are used for the distribution of title lists (KBART), e.g Knowledge Base+.
+CSV and TSV files are widely used to store and exchange simple structured data. Many open datasets are published as CSV or TSV files, see e.g. datahub.io. Within the library community CSV files are used for the distribution of title lists (KBART), e.g. Knowledge Base+.
-Metafacture implements an decoder and encoder for both formats: decode-csv and encode-csv.
+Metafacture implements a decoder and an encoder which you can use for both formats: `decode-csv` and `encode-csv`.
## Reading CSVs
-So get some CSV data to work with:
+Get some CSV data to work with:
```text
"https://lib.ugent.be/download/librecat/data/goodreads.csv"
@@ -24,9 +24,9 @@ So get some CSV data to work with:
;
```
-It shows a CSV file with a header row at the beginnung.
+It shows a CSV file with a header row at the beginning.
-Now you can convert the data to different formats, like JSON, YAML and XML by decoding the data as csv and encoding it in the desired format:
+Convert the data to different serializations, like JSON, YAML and XML, by decoding the data as CSV and encoding it in the desired serialization:
```
"https://lib.ugent.be/download/librecat/data/goodreads.csv"
@@ -40,11 +40,10 @@ Now you can convert the data to different formats, like JSON, YAML and XML by de
[See in playground.](https://metafacture.org/playground/?flux=%22https%3A//lib.ugent.be/download/librecat/data/goodreads.csv%22%0A%7C+open-http%0A%7C+as-lines%0A%7C+decode-csv%0A%7C+encode-json%28prettyPrinting%3D%22true%22%29+//+or+encode-xml+or+encode-yaml%0A%7C+print%0A%3B)
-See that the elements have no name literal names but are only numbers.
-But the csv has a header we need to add the option `(hasHeader="true")` to `decode-csv` in the flux.
+Note that the elements have no literal names but only numbers.
+As the CSV has a header, we need to add the option `(hasHeader="true")` to `decode-csv` in the Flux.
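With the option set, the workflow reads as follows (the same example as above, only the decoder option changes):

```text
"https://lib.ugent.be/download/librecat/data/goodreads.csv"
| open-http
| as-lines
| decode-csv(hasHeader="true")
| encode-json(prettyPrinting="true")
| print
;
```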
-
-You can extract specified fields while converting to another tabular format by using the fix. This is quite handy for analysis of specific fields or to generate reports. In the following example we only keep three columns (`ISBN"`,`"Title"`,`"Author"`):
+You can extract specified fields while converting to another tabular format by using the Fix. This is quite handy for analysis of specific fields or to generate reports. In the following example we only keep three columns (`"ISBN"`,`"Title"`,`"Author"`):
Flux:
@@ -60,13 +59,14 @@ Flux:
```
With Fix:
-```
+
+```perl
retain("ISBN","Title","Author")
```
[See the example in the Playground](https://metafacture.org/playground/?flux=%22https%3A//lib.ugent.be/download/librecat/data/goodreads.csv%22%0A%7C+open-http%0A%7C+as-lines%0A%7C+decode-csv%28hasHeader%3D%22true%22%29%0A%7C+fix%28transformationFile%29%0A%7C+encode-csv%28includeHeader%3D%22true%22%29%0A%7C+print%0A%3B&transformation=retain%28%22ISBN%22%2C%22Title%22%2C%22Author%22%29)
-By default Metafactures `decode-csv` expects that CSV fields are separated by comma ‘,’ and strings are quoted with double qoutes ‘”‘ or single quotes `'`. You can specify other characters as separator or quotes with the option ‘separator’ and clean special quote signs with the fix. (In contrast to Catmandu quote-chars cannot be manipulated by the decoder directly, yet.)
+By default Metafacture's `decode-csv` expects that CSV fields are separated by a comma `,` and strings are quoted with double quotes `"` or single quotes `'`. You can specify other characters as separator or quotes with the option `separator` and clean special quote signs using the Fix. (In contrast to Catmandu, quote chars cannot be manipulated by the decoder directly, yet.)
Flux:
@@ -82,17 +82,15 @@ Flux:
Fix:
-```
+```perl
replace_all("?","^\\$|\\$$","")
```
[See the example in the Playground.](https://metafacture.org/playground/?flux=%2212157%3B%24The+Journal+of+Headache+and+Pain%24%3B2193-1801%22%0A%7C+read-string%0A%7C+as-lines%0A%7C+decode-csv%28separator%3D%22%3B%22%29%0A%7C+fix%28transformationFile%29%0A%7C+encode-csv%28separator%3D%22\t%22%2C+includeheader%3D%22true%22%29%0A%7C+print%3B&transformation=replace_all%28%22%3F%22%2C%22%5E\\%24%7C\\%24%24%22%2C%22%22%29)
-In the example above we read the string as a little CSV fragment using the `read-string` command for our small test. It will read the tiny CSV string which uses “;” and “$” as separation and quotation characters.
+In the example above we read the string as a little CSV fragment using the `read-string` command. It reads the tiny CSV string which uses `;` and `$` as separation and quotation characters.
The string is then read line by line by `as-lines` and decoded as CSV with the separator `;`.
-With a little fix you can
-
## Writing CSVs
When harvesting data in tabular format you can also change the field names in the header or omit the header:
@@ -121,7 +119,7 @@ retain("A","B","C")
[See example in the Playground.](https://metafacture.org/playground/?flux=%22https%3A//lib.ugent.be/download/librecat/data/goodreads.csv%22%0A%7C+open-http%0A%7C+as-lines%0A%7C+decode-csv%28hasheader%3D%22true%22%29%0A%7C+fix%28transformationFile%29%0A%7C+encode-csv%28includeHeader%3D%22true%22%29%0A%7C+print%3B&transformation=move_field%28%22ISBN%22%2C%22A%22%29%0Amove_field%28%22Title%22%2C%22B%22%29%0Amove_field%28%22Author%22%2C%22C%22%29%0A%0Aretain%28%22A%22%2C%22B%22%2C%22C%22%29)
-You can transform the data to an tsv file with the separator \t which has no header like this.
+You can transform the data to a TSV file with the separator `\t` which has no header like this:
```text
"https://lib.ugent.be/download/librecat/data/goodreads.csv"
@@ -134,17 +132,14 @@ You can transform the data to an tsv file with the separator \t which has no hea
[See example in playground.](https://metafacture.org/playground/?flux=%22https%3A//lib.ugent.be/download/librecat/data/goodreads.csv%22%0A%7C+open-http%0A%7C+as-lines%0A%7C+decode-csv%28hasheader%3D%22true%22%29%0A%7C+fix%28transformationFile%29%0A%7C+encode-csv%28separator%3D%22\t%22%2C+noQuotes%3D%22true%22%29%0A%7C+print%3B&transformation=retain%28%22ISBN%22%2C%22Title%22%2C%22Author%22%29)
-When you create a CSV from a by export complex/nested data structures to a tabular format, you must “flatten” the datastructure. Also
-you have to be aware that the order and number of elements in every record is the same otherwise the header does not match the records.
-
-But could be done with Metafacture. But be aware that the nested structure if repeatble elements are provided have to be the identical every time. Otherwise the header and the csv file do not fit:
+When you create a CSV from complex/nested data structures, you must “flatten” the data structure. Also be aware that the order and number of elements must be the same in every record, since the header has to match the records.
-https://metafacture.org/playground/?flux=%22https%3A//lobid.org/organisations/search%3Fq%3Dk%25C3%25B6ln%26size%3D10%22%0A%7C+open-http%28accept%3D%22application/json%22%29%0A%7C+as-records%0A%7C+decode-json%28recordpath%3D%22member%22%29%0A%7C+flatten%0A%7C+encode-csv%28includeheader%3D%22true%22%29%0A%7C+print%3B
+So: make sure that the nested structure of repeatable elements is identical every time. Otherwise the [header and the CSV file do not fit](https://metafacture.org/playground/?flux=%22https%3A//lobid.org/organisations/search%3Fq%3Dk%25C3%25B6ln%26size%3D10%22%0A%7C+open-http%28accept%3D%22application/json%22%29%0A%7C+as-records%0A%7C+decode-json%28recordpath%3D%22member%22%29%0A%7C+flatten%0A%7C+encode-csv%28includeheader%3D%22true%22%29%0A%7C+print%3B).
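Decoded from the Playground link, the workflow looks like this (`flatten` turns the nested JSON structure into flat literals before the CSV encoding):

```text
"https://lobid.org/organisations/search?q=köln&size=10"
| open-http(accept="application/json")
| as-records
| decode-json(recordpath="member")
| flatten
| encode-csv(includeheader="true")
| print
;
```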
Exercises:
-- [Decode this csv keep the header.](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A...%0A...%0A%7C+encode-yaml%0A%7C+print%0A%3B&data=%22id%22%2C%22name%22%2C%22creator%22%0A%221%22%2C%22Book+1%22%2C%22Maxi+Muster%22%0A%222%22%2C%22Book+2%22%2C%22Sandy+Sample%22)
-- [Create a tsv with the record idenfier (`_id`), title (`245` > `title`) and isbn (`020` > `isbn`) from a marc dump.](https://metafacture.org/playground/?flux=%22https%3A//raw.githubusercontent.com/metafacture/metafacture-core/master/metafacture-runner/src/main/dist/examples/read/marc21/10.marc21%22%0A%7C+open-http%0A%7C+as-lines%0A%7C+decode-marc21%0A%7C+fix%28transformationFile%29%0A%7C+flatten%0A%7C+encode-csv%28includeHeader%3D%22TRUE%22%2C+separator%3D%22\t%22%2C+noQuotes%3D%22false%22%29%0A%7C+print%0A%3B&transformation=)
+- [Decode this CSV while keeping the header.](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A...%0A...%0A%7C+encode-yaml%0A%7C+print%0A%3B&data=%22id%22%2C%22name%22%2C%22creator%22%0A%221%22%2C%22Book+1%22%2C%22Maxi+Muster%22%0A%222%22%2C%22Book+2%22%2C%22Sandy+Sample%22)
+- [Create a TSV with the record identifier (`_id`), title (`245` > `title`) and ISBN (`020` > `isbn`) from a MARC dump.](https://metafacture.org/playground/?flux=%22https%3A//raw.githubusercontent.com/metafacture/metafacture-core/master/metafacture-runner/src/main/dist/examples/read/marc21/10.marc21%22%0A%7C+open-http%0A%7C+as-lines%0A%7C+decode-marc21%0A%7C+fix%28transformationFile%29%0A%7C+flatten%0A%7C+encode-csv%28includeHeader%3D%22TRUE%22%2C+separator%3D%22\t%22%2C+noQuotes%3D%22false%22%29%0A%7C+print%0A%3B&transformation=)
---------------
diff --git a/docs/10_Working_with_XML.md b/docs/10_Working_with_XML.md
index 76127c2..a609e34 100644
--- a/docs/10_Working_with_XML.md
+++ b/docs/10_Working_with_XML.md
@@ -7,14 +7,14 @@ parent: Tutorial
# Lesson 10: Working with XML
-While CSV are one type of file format that are used for data exchange. The other one which is most famous is XML.
+CSV is one file format used for data exchange. Another, probably the most famous one, is XML.
-XML files are used as internal data format and exchange format.
+XML is used both as an internal data serialization and as an exchange serialization.
Also a lot of metadata profiles and data formats in the cultural heritage sector are serialized in XML:
e.g. LIDO, MODS, METS, PREMIS, MARCXML, PICAXML, DC, ONIX and so on.
-XML is decoded a little bit differently than other data formats since on the one hand
-the decoder follows straight after the opening of a file, a website or an OAI-PMH.
+XML is decoded a little differently than other data serializations: first the XML serialization itself is decoded,
+and then the inherent format is handled accordingly.
Let's start with this simple record:
@@ -40,12 +40,12 @@ inputFile
[See it here in the Playground.](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+as-records%0A%7C+print%0A%3B&data=%3C%3Fxml+version%3D%221.0%22+encoding%3D%22utf-8%22%3F%3E%0A%3Crecord%3E%0A++%3Ctitle%3EGRM%3C/title%3E%0A++%3Cauthor%3ESibille+Berg%3C/author%3E%0A++%3CdatePublished%3E2019%3C/datePublished%3E%0A%3C/record%3E)
-Next lets decode the file and encode it as Yaml.
+Next let's decode the file and encode it as YAML.
-But to decode it as xml we have to use the `decode-xml` command. But using only the decoder does not help. We additionally need a handler for xml.
-Handlers a specific helpers that decode xml in a certain way, based on the metadata standard that this xml is based on.
+To decode it as XML the `decode-xml` command is used. But using only the decoder does not help. We additionally need a handler for XML.
+Handlers are specific helpers that process the decoded XML in a certain way, based on the metadata standard the XML follows.
-For now we need the `handle-generic-xml` function.
+For now we need the `handle-generic-xml` function:
```text
inputFile
@@ -72,10 +72,8 @@ datePublished:
value: "2019"
```
-What is special about the handling, it that the values of the different xml-elements are not decoded straigt as the value of the element but as a subfield called value.
-This is due to the fact that xml element cant have a value and additional attributes and to catch both MF introduces subfields for the value and potential attributes:
-
-https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+decode-xml%0A%7C+handle-generic-xml%0A%7C+encode-yaml%0A%7C+print%0A%3B&data=%3C%3Fxml+version%3D%221.0%22+encoding%3D%22utf-8%22%3F%3E%0A%3Crecord%3E%0A++%3Ctitle+attribute%3D%22test%22%3ETest+value%3C/title%3E%0A%3C/record%3E
+What is special about the handling is that the values of the different XML elements are not decoded directly as the value of the element but as a subfield called `value`.
+This is because an XML element can have a value _and_ additional attributes; to capture both, [MF introduces subfields for the value and the potential attributes](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+decode-xml%0A%7C+handle-generic-xml%0A%7C+encode-yaml%0A%7C+print%0A%3B&data=%3C%3Fxml+version%3D%221.0%22+encoding%3D%22utf-8%22%3F%3E%0A%3Crecord%3E%0A++%3Ctitle+attribute%3D%22test%22%3ETest+value%3C/title%3E%0A%3C/record%3E).
See:
@@ -83,7 +81,7 @@ See:
Test value
```
-With the Flux:
+with the Flux:
```text
inputFile
@@ -94,6 +92,7 @@ inputFile
| print
;
```
+results in:
```yaml
title:
@@ -101,10 +100,10 @@ title:
value: "Test value"
```
-[For our example above to get rid of the value subfields in the yaml we need to change the hirachy:](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+decode-xml%0A%7C+handle-generic-xml%0A%7C+fix%28transformationFile%29%0A%7C+encode-yaml%0A%7C+print%0A%3B&transformation=move_field%28%22title.value%22%2C%22@title%22%29%0Amove_field%28%22@title%22%2C%22title%22%29%0Amove_field%28%22author.value%22%2C%22@author%22%29%0Amove_field%28%22@author%22%2C%22author%22%29%0Amove_field%28%22datePublished.value%22%2C%22@datePublished%22%29%0Amove_field%28%22@datePublished%22%2C%22datePublished%22%29&data=%3C%3Fxml+version%3D%221.0%22+encoding%3D%22utf-8%22%3F%3E%0A%3Crecord%3E%0A++%3Ctitle%3EGRM%3C/title%3E%0A++%3Cauthor%3ESibille+Berg%3C/author%3E%0A++%3CdatePublished%3E2019%3C/datePublished%3E%0A%3C/record%3E)
+[For our example above, to get rid of the value subfields in the YAML, we need to change the hierarchy:](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+decode-xml%0A%7C+handle-generic-xml%0A%7C+fix%28transformationFile%29%0A%7C+encode-yaml%0A%7C+print%0A%3B&transformation=move_field%28%22title.value%22%2C%22@title%22%29%0Amove_field%28%22@title%22%2C%22title%22%29%0Amove_field%28%22author.value%22%2C%22@author%22%29%0Amove_field%28%22@author%22%2C%22author%22%29%0Amove_field%28%22datePublished.value%22%2C%22@datePublished%22%29%0Amove_field%28%22@datePublished%22%2C%22datePublished%22%29&data=%3C%3Fxml+version%3D%221.0%22+encoding%3D%22utf-8%22%3F%3E%0A%3Crecord%3E%0A++%3Ctitle%3EGRM%3C/title%3E%0A++%3Cauthor%3ESibille+Berg%3C/author%3E%0A++%3CdatePublished%3E2019%3C/datePublished%3E%0A%3C/record%3E)
-```
+```text
inputFile
| open-file
| decode-xml
@@ -115,7 +114,8 @@ inputFile
;
```
-With Fix:
+with Fix:
+
```perl
move_field("title.value","@title")
move_field("@title","title")
@@ -125,7 +125,7 @@ move_field("datePublished.value","@datePublished")
move_field("@datePublished","datePublished")
```
-But when you encode it to XML the value subfields are also kept. Like this:
+But when it's encoded into XML, the value subfields are also kept, like this:
```text
inputFile
@@ -136,7 +136,8 @@ inputFile
| print
;
```
-Results in:
+
+results in:
```xml
@@ -159,9 +160,9 @@ Results in:
[Playground Link](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+decode-xml%0A%7C+handle-generic-xml%0A%7C+encode-xml%0A%7C+print%0A%3B&data=%3C%3Fxml+version%3D%221.0%22+encoding%3D%22utf-8%22%3F%3E%0A%3Crecord%3E%0A++%3Ctitle%3EGRM%3C/title%3E%0A++%3Cauthor%3ESibille+Berg%3C/author%3E%0A++%3CdatePublished%3E2019%3C/datePublished%3E%0A%3C/record%3E)
-Keep in mind that xml elements can have attributes and a value. But also the encoder enable simple flat xml records too.
+Keep in mind that XML elements can have attributes and a value. But the encoder `encode-xml` can also produce simple flat XML records when told to.
-You have to add a specific option when encoding xml: `| encode-xml(valueTag="value")` . Then it results in:
+You have to add a specific option when encoding XML: `| encode-xml(valueTag="value")`. Then it results in:
```xml
@@ -177,8 +178,8 @@ You have to add a specific option when encoding xml: `| encode-xml(valueTag="val
```
-If you want to create the other elements as attributes. You have to tell MF which elements are attributes by adding a attributeMarker with the option `attributemarker` in handle generic xml.
-Here I use `@` as attribute marker:
+If you want to create the other elements as attributes you have to tell MF which elements are attributes by adding an "attribute marker" with the option `attributemarker` in `handle-generic-xml`.
+Here an `@` acts as the attribute marker:
```text
inputFile
@@ -192,7 +193,7 @@ inputFile
[Playground Link](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+decode-xml%0A%7C+handle-generic-xml%28attributeMarker%3D%22@%22%29%0A%7C+encode-xml%28attributeMarker%3D%22@%22%2CvalueTag%3D%22value%22%29%0A%7C+print%0A%3B&data=%3C%3Fxml+version%3D%221.0%22+encoding%3D%22utf-8%22%3F%3E%0A%3Crecord%3E%0A++%3Ctitle+attribute%3D%22test%22%3ETest+value%3C/title%3E%0A%3C/record%3E)
-When you encode it as yaml you see the magic behind it:
+When you encode it as YAML you see the magic behind it:
```text
inputFile
@@ -206,7 +207,7 @@ inputFile
[Playground Link](https://metafacture.org/playground/?flux=inputFile%0A%7C+open-file%0A%7C+decode-xml%0A%7C+handle-generic-xml%28attributeMarker%3D%22@%22%29%0A%7C+encode-yaml%0A%7C+print%0A%3B&data=%3C%3Fxml+version%3D%221.0%22+encoding%3D%22utf-8%22%3F%3E%0A%3Crecord%3E%0A++%3Ctitle+attribute%3D%22test%22%3ETest+value%3C/title%3E%0A%3C/record%3E)
-Another important thing, when working with xml data sets is to specify the record tag. Default is the tag record. But other data sets have different tags that separate records:
+Another important thing when working with XML data sets is to specify the record tag. The default record tag is "record". But other data sets have different tags to separate records:
```text
"http://www.lido-schema.org/documents/examples/LIDO-v1.1-Example_FMobj00154983-LaPrimavera.xml"
@@ -241,8 +242,8 @@ Add this option to the previous example and see that there are elements belongin
See this in the Playground [here](https://metafacture.org/playground/?flux=%22http%3A//www.lido-schema.org/documents/examples/LIDO-v1.1-Example_FMobj00154983-LaPrimavera.xml%22%0A%7C+open-http%0A%7C+decode-xml%0A%7C+handle-generic-xml%28recordtagname%3D%22lido%22%2C+emitnamespace%3D%22true%22%29%0A%7C+encode-yaml%0A%7C+print%0A%3B).
-When you want to add the namespace definition to the output metafacture does not know that by itself but you have to tell metafacture
-the new namespace when `encoding-xml` either by a file with the option `namespacefile` or in the flux with the option `namespaces`.
+When you want to add the namespace definition to the output, Metafacture does not know it by itself. So you have to tell Metafacture
+the new namespace when using `encode-xml`, either via a file with the option `namespacefile` or in the Flux with the option `namespaces`, where multiple namespaces are separated by a `\n`.
See here an example for adding namespaces in the Flux:
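A sketch of what this could look like, reusing the LIDO example from above (the exact `prefix: uri` syntax of the `namespaces` option is an assumption here; check the `encode-xml` documentation):

```text
"http://www.lido-schema.org/documents/examples/LIDO-v1.1-Example_FMobj00154983-LaPrimavera.xml"
| open-http
| decode-xml
| handle-generic-xml(recordtagname="lido")
| encode-xml(namespaces="lido: http://www.lido-schema.org")
| print
;
```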
diff --git a/docs/11_MARC_to_Dublin_Core.md b/docs/11_MARC_to_Dublin_Core.md
index b036c1d..939f0f3 100644
--- a/docs/11_MARC_to_Dublin_Core.md
+++ b/docs/11_MARC_to_Dublin_Core.md
@@ -6,14 +6,15 @@ parent: Tutorial
---
-# Lesson 11 : From MARC to Dublin Core as loud JSON-LD
+# Lesson 11: From MARC to Dublin Core as Linked Open Usable Data (LOUD)
+
TODO: Use a better example. But the following one is missing ISBNs: https://github.com/metafacture/metafacture-examples/blob/master/Swissbib-Extensions/MARC-CSV/
-Today we will look a bit further into MARC processing with Metafacture. We already saw a bit of MARC processing in and today we will show you how to transform MARC records into Dublin Core and providing the data as linked open usable data.
+Today we will look a bit further into MARC processing with Metafacture. We already saw some MARC processing; now we will transform MARC records into Dublin Core, providing the data as Linked Open Usable Data.
-To transform this MARC file into Dublin Core we need to create a fix file. You can use any texteditor for this and create a file dublin.fix (or use the transformationFile window in the playground):
+To transform this MARC file into Dublin Core we need to create a Fix file. You can use any text editor for this and create a file `dublin.fix` (or use the `transformationFile` window in the Playground).
-And type into this textfile the following fixes:
+Type into the text file the following Fix commands:
```perl
copy_field("245??.a","title")
@@ -37,20 +38,20 @@ end
retain("title","creator[]","date","publisher","isbn[]","issn[]","subject[]")
```
-Every MARC record contains in the 245-field the title of a record. In the first line we map the MARC-245 field to new field in the record called title:
+Every MARC record contains the title of the record in the 245 field. In the first line we map the MARC-245 field to a new field in the record called "title":
`copy_field("245??.a","title")`
-In the line 2-4 we map authors to a field creator. In the the marc records the authors are stored in the MARC-100 and MARC-700 field. Because there is usually more than one author in a record, we need to $append them to create an array (a list) of one or more creator-s.
+In lines 2-4 we map authors to a field "creator". In the MARC records the authors are stored in the MARC-100 and MARC-700 fields. Because there is usually more than one author in a record, we need to `$append` them to create an array (a list) of one or more creators.
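A sketch of this `$append` mapping, reconstructed from the description (subfield `a` holds the author's name):

```perl
copy_field("100??.a","creator[].$append")
copy_field("700??.a","creator[].$append")
```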
-In line 5 and line 6 we read the MARC-260 field which contains publisher and date information. Here we don’t need the $append trick because there is usually only one 260-field in a MARC record.
+In line 5 and line 6 we read the MARC-260 field which contains publisher and date information. Here we don’t need the `$append` trick because there is usually only one 260-field in a MARC record.
-In line 7 to line 15 we do the same trick to filter out the ISBN and ISSN number out of the record which we store in separate fields isbn and issn (indeed these are not Dublin Core fields, we will process them later). But because these elements can be repeated we iterate over them with a list bind and copy the values in an array.
+In lines 7 to 15 we use the same trick to filter the ISBN and ISSN numbers out of the record, which we store in separate fields "isbn" and "issn" (indeed these are not Dublin Core fields; we will process them later). Because these elements can be repeated, we iterate over them with a list bind and copy the values into an array.
-In line 16-19 the subjects are to extracted from the 260-field using the same $append trick as above. Notice that we only extracted the $a subfields?
+In lines 16-19 the subjects are extracted from the 260-field using the same `$append` trick as above. Notice that we only extracted the `$a` subfields.
-We end the fix and retain only those elements that we want to keep.
+We end the Fix and retain only those elements that we want to keep.
-Given the dublin.txt file above we can execute the filtering command like this:
+Given the `dublin.fix` file above we can execute the filtering command like this:
TODO: Explain how to run the function with CLI.
@@ -87,7 +88,7 @@ title: Propositional structure and illocutionary force :a study of the contribut
...
```
-Congratulations, you’ve created your first mapping file to transform library data from MARC to Dublin Core! We need to add a bit more cleaning to delete some periods and commas here and there but as is we already have our first mapping.
+Congratulations, you’ve created your first mapping file to transform library data from MARC to Dublin Core! We need to add a bit more cleaning to delete some periods and commas here and there but as it is we already have our first mapping.
Below you’ll find a complete example. You can read more about our Fix language online.
@@ -120,9 +121,9 @@ end
retain("title","creator[]","date","publisher","isbn[]","issn[]","subject[]")
```
-We can turn this data also to JSON-LD by adding a context that specifies the elements with URIs.
+We can also turn this data into JSON-LD by adding a context that specifies the elements with URIs.
-Add the following fix to the fix above:
+Add the following Fix to the Fix above:
```perl
add_field("@context.title","http://purl.org/dc/terms/title")