Advanced Tutorial
=================

This tutorial assumes the reader is familiar with the D3M ecosystem in general.
If not, please refer to other sections of the `documentation`_ first, e.g.,
:ref:`quickstart`.

.. _documentation: https://docs.datadrivendiscovery.org
Overview of building a primitive
--------------------------------

1. :ref:`Recognize the base class of a primitive <primitive-class>`.
2. :ref:`Identify the input and output container types <input-output-types>`.
3. :ref:`Define metadata for each primitive <tutorial-primitive-metadata>`.
4. :ref:`Write a unit test to verify that the primitive functions <unit-tests>`.
5. :ref:`Generate the primitive annotation for the primitive <primitive-annotation>`.
6. :ref:`Write a pipeline demonstrating the primitive's functionality <example-pipeline>`.
7. :ref:`Advanced: a primitive might use static files <static-files>`.
.. _primitive-class:

Primitive class
---------------

There are a variety of :py:mod:`primitive interfaces/classes <d3m.primitive_interfaces>` available. As an example,
for a primitive doing just attribute extraction without requiring any fitting, a :py:class:`~d3m.primitive_interfaces.transformer.TransformerPrimitiveBase`
from the :py:mod:`~d3m.primitive_interfaces.transformer` module can be used.

Each primitive can have its own :py:mod:`hyper-parameters <d3m.metadata.hyperparams>`. Some example hyper-parameter types one can use to describe
a primitive's hyper-parameters are: :py:class:`~d3m.metadata.hyperparams.Constant`, :py:class:`~d3m.metadata.hyperparams.UniformBool`,
:py:class:`~d3m.metadata.hyperparams.UniformInt`, :py:class:`~d3m.metadata.hyperparams.Choice`, and :py:class:`~d3m.metadata.hyperparams.List`.

Also, each hyper-parameter should be defined as one or more of the four :ref:`hyper-parameter semantic types <hyperparameters>`:

* `https://metadata.datadrivendiscovery.org/types/TuningParameter <https://metadata.datadrivendiscovery.org/types/TuningParameter>`__
* `https://metadata.datadrivendiscovery.org/types/ControlParameter <https://metadata.datadrivendiscovery.org/types/ControlParameter>`__
* `https://metadata.datadrivendiscovery.org/types/ResourcesUseParameter <https://metadata.datadrivendiscovery.org/types/ResourcesUseParameter>`__
* `https://metadata.datadrivendiscovery.org/types/MetafeatureParameter <https://metadata.datadrivendiscovery.org/types/MetafeatureParameter>`__
Example
~~~~~~~

.. code:: python

    from d3m.primitive_interfaces import base, transformer
    from d3m.metadata import base as metadata_base, hyperparams

    __all__ = ('ExampleTransformPrimitive',)


    class Hyperparams(hyperparams.Hyperparams):
        learning_rate = hyperparams.Uniform(lower=0.0, upper=1.0, default=0.001, semantic_types=[
            'https://metadata.datadrivendiscovery.org/types/TuningParameter',
        ])
        clusters = hyperparams.UniformInt(lower=1, upper=100, default=10, semantic_types=[
            'https://metadata.datadrivendiscovery.org/types/TuningParameter',
        ])


    # Inputs and Outputs are container types, defined in the next section.
    class ExampleTransformPrimitive(transformer.TransformerPrimitiveBase[Inputs, Outputs, Hyperparams]):
        """
        The docstring is very important and must be included. It should contain
        relevant information about the hyper-parameters, primitive functionality, etc.
        """

        def produce(self, *, inputs: Inputs, timeout: float = None, iterations: int = None) -> base.CallResult[Outputs]:
            pass
.. _input-output-types:

Input/Output types
------------------

The acceptable inputs/outputs of a primitive must be pre-defined. D3M supports a variety of
standard input/output :ref:`container types <container_types>` such as:

- ``pandas.DataFrame`` (as :py:class:`d3m.container.pandas.DataFrame`)
- ``numpy.ndarray`` (as :py:class:`d3m.container.numpy.ndarray`)
- ``list`` (as :py:class:`d3m.container.list.List`)

.. note::
    Even though D3M container types behave mostly like the standard types, the D3M container types must be used for inputs/outputs, because D3M container types support D3M metadata.
Example
~~~~~~~

.. code:: python

    from d3m import container

    Inputs = container.DataFrame
    Outputs = container.DataFrame


    class ExampleTransformPrimitive(transformer.TransformerPrimitiveBase[Inputs, Outputs, Hyperparams]):
        ...
.. note::
    When returning the output DataFrame, its metadata should be updated with the correct semantic and structural types.

Example
~~~~~~~

.. code:: python

    # Update metadata for each DataFrame column.
    for column_index in range(outputs.shape[1]):
        column_metadata = {}
        column_metadata['structural_type'] = type(1.0)
        column_metadata['name'] = 'column {i}'.format(i=column_index)
        column_metadata['semantic_types'] = ('http://schema.org/Float', 'https://metadata.datadrivendiscovery.org/types/Attribute')
        outputs.metadata = outputs.metadata.update((metadata_base.ALL_ELEMENTS, column_index), column_metadata)
.. _tutorial-primitive-metadata:

Primitive Metadata
------------------

It is crucial to properly define :ref:`primitive metadata <primitive-metadata>` for the primitive.
Primitive metadata can be used by TA2 systems to meta-learn about primitives and, in general, to decide which primitive to use when.

Example
~~~~~~~

.. code:: python

    import os

    from d3m import utils as d3m_utils
    from d3m.primitive_interfaces import base, transformer
    from d3m.metadata import base as metadata_base, hyperparams

    __all__ = ('ExampleTransformPrimitive',)


    class ExampleTransformPrimitive(transformer.TransformerPrimitiveBase[Inputs, Outputs, Hyperparams]):
        """
        Docstring.
        """

        metadata = metadata_base.PrimitiveMetadata({
            'id': <Unique-ID, generated using UUID>,
            'version': <Primitive-development-version>,
            'name': <Primitive-Name>,
            'python_path': 'd3m.primitives.<>.<>.<>',  # Must match the path in setup.py.
            'source': {
                'name': <Project-maintainer-name>,
                'uris': [<GitHub-link-to-project>],
                'contact': 'mailto:<Author E-Mail>',
            },
            'installation': [{
                'type': metadata_base.PrimitiveInstallationType.PIP,
                'package_uri': 'git+<git-link-to-project>@{git_commit}#egg=<Package_name>'.format(
                    git_commit=d3m_utils.current_git_commit(os.path.dirname(__file__)),
                ),
            }],
            'algorithm_types': [
                # See https://metadata.datadrivendiscovery.org/devel/?definitions#definitions.algorithm_types for all available algorithm types.
                # If an algorithm type is not available, a merge request should be made to add it to the core package.
                metadata_base.PrimitiveAlgorithmType.<Choose-the-algorithm-type-that-best-describes-the-primitive>,
            ],
            # See https://metadata.datadrivendiscovery.org/devel/?definitions#definitions.primitive_family for all available primitive family types.
            # If a primitive family is not available, a merge request should be made to add it to the core package.
            'primitive_family': metadata_base.PrimitiveFamily.<Choose-the-primitive-family-that-closely-associates-to-the-primitive>,
        })

        ...
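The ``id`` field is a UUID string. As a small sketch, it can be generated once with Python's standard ``uuid`` module, and the resulting literal then hard-coded into the metadata:

```python
import uuid

# Generate a random UUID once and paste the printed literal into the
# metadata's 'id' field; do not regenerate it on every import.
primitive_id = str(uuid.uuid4())
print(primitive_id)
```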
.. _unit-tests:

Unit tests
----------

Once the primitive is constructed, unit testing must be done to see if the
primitive works as intended.

**Sample Setup**

.. code:: python

    import os
    import unittest

    from d3m.container.dataset import Dataset
    from d3m.metadata import base as metadata_base
    from common_primitives import dataset_to_dataframe

    from example_primitive import ExampleTransformPrimitive


    class ExampleTransformTest(unittest.TestCase):
        def test_happy_path(self):
            # Load a dataset.
            # Datasets can be obtained from: https://datasets.datadrivendiscovery.org/d3m/datasets
            base_path = '../datasets/training_datasets/seed_datasets_archive/'
            dataset_doc_path = os.path.join(base_path, '38_sick_dataset', 'datasetDoc.json')
            dataset = Dataset.load('file://{dataset_doc_path}'.format(dataset_doc_path=os.path.abspath(dataset_doc_path)))

            # Convert the dataset into a DataFrame.
            dataframe_hyperparams_class = dataset_to_dataframe.DatasetToDataFramePrimitive.metadata.get_hyperparams()
            dataframe_primitive = dataset_to_dataframe.DatasetToDataFramePrimitive(hyperparams=dataframe_hyperparams_class.defaults())
            dataframe = dataframe_primitive.produce(inputs=dataset).value

            # Call the example transformer.
            hyperparams_class = ExampleTransformPrimitive.metadata.get_hyperparams()
            primitive = ExampleTransformPrimitive(hyperparams=hyperparams_class.defaults())
            test_out = primitive.produce(inputs=dataframe).value

            # Write assertions to make sure that the output (type, shape, metadata) is what is expected.
            self.assertEqual(...)
            ...


    if __name__ == '__main__':
        unittest.main()
It is recommended to do the testing inside the D3M Docker container:

.. code:: shell

    docker run --rm -v /home/foo/d3m:/mnt/d3m -it \
        registry.gitlab.com/datadrivendiscovery/images/primitives:ubuntu-bionic-python36-v2020.1.9
    cd /mnt/d3m/example_primitive
    python3 primitive_name_test.py
.. _primitive-annotation:

Primitive annotation
--------------------

Once the primitive is constructed and unit testing is successful, the
final step in building a primitive is to generate the primitive annotation,
which will be indexed and used by D3M.

.. code:: shell

    docker run --rm -v /home/foo/d3m:/mnt/d3m -it \
        registry.gitlab.com/datadrivendiscovery/images/primitives:ubuntu-bionic-python36-v2020.1.9
    cd /mnt/d3m/example_primitive
    pip3 install -e .
    python3 -m d3m index describe -i 4 <primitive_name>

Alternatively, a `helper script <https://gitlab.com/datadrivendiscovery/docs-quickstart/-/blob/master/quickstart_primitives/generate-primitive-json.py>`__
can be used to generate primitive annotations as well.
This can be more convenient when having to manage multiple primitives.
In this case, generating the primitive annotation is done as follows:

.. code:: shell

    docker run --rm -v /home/foo/d3m:/mnt/d3m -it \
        registry.gitlab.com/datadrivendiscovery/images/primitives:ubuntu-bionic-python36-v2020.1.9
    cd /mnt/d3m/example_primitive
    pip3 install -e .
    python3 generate-primitive-json.py ...
.. _example-pipeline:

Example pipeline
----------------

After building a custom primitive, it has to be used in an example pipeline and run using one of the
D3M seed datasets in order to be integrated with other indexed D3M primitives.
The essential elements of pipelines are:

``Dataset Denormalizer -> Dataset Parser -> Data Cleaner (if necessary) -> Feature Extraction -> Classifier/Regressor -> Output``

Example code for building a pipeline is shown below:
.. code:: python

    # D3M dependencies
    from d3m import index
    from d3m.metadata.base import ArgumentType
    from d3m.metadata.pipeline import Pipeline, PrimitiveStep

    # Common primitives
    from common_primitives.column_parser import ColumnParserPrimitive
    from common_primitives.dataset_to_dataframe import DatasetToDataFramePrimitive
    from common_primitives.extract_columns_semantic_types import ExtractColumnsBySemanticTypesPrimitive

    # Testing primitive
    from quickstart_primitives.sample_primitive1.input_to_output import InputToOutputPrimitive

    # Pipeline
    pipeline = Pipeline()
    pipeline.add_input(name='inputs')

    # Step 0: DatasetToDataFrame (Dataset Denormalizer)
    step_0 = PrimitiveStep(primitive_description=DatasetToDataFramePrimitive.metadata.query())
    step_0.add_argument(name='inputs', argument_type=ArgumentType.CONTAINER, data_reference='inputs.0')
    step_0.add_output('produce')
    pipeline.add_step(step_0)

    # Step 1: Custom primitive
    step_1 = PrimitiveStep(primitive=InputToOutputPrimitive)
    step_1.add_argument(name='inputs', argument_type=ArgumentType.CONTAINER, data_reference='steps.0.produce')
    step_1.add_output('produce')
    pipeline.add_step(step_1)

    # Step 2: Column Parser (Dataset Parser)
    step_2 = PrimitiveStep(primitive_description=ColumnParserPrimitive.metadata.query())
    step_2.add_argument(name='inputs', argument_type=ArgumentType.CONTAINER, data_reference='steps.1.produce')
    step_2.add_output('produce')
    pipeline.add_step(step_2)

    # Step 3: Extract Attributes (Feature Extraction)
    step_3 = PrimitiveStep(primitive_description=ExtractColumnsBySemanticTypesPrimitive.metadata.query())
    step_3.add_argument(name='inputs', argument_type=ArgumentType.CONTAINER, data_reference='steps.2.produce')
    step_3.add_output('produce')
    step_3.add_hyperparameter(name='semantic_types', argument_type=ArgumentType.VALUE, data=['https://metadata.datadrivendiscovery.org/types/Attribute'])
    pipeline.add_step(step_3)

    # Step 4: Extract Targets (Feature Extraction)
    step_4 = PrimitiveStep(primitive_description=ExtractColumnsBySemanticTypesPrimitive.metadata.query())
    step_4.add_argument(name='inputs', argument_type=ArgumentType.CONTAINER, data_reference='steps.0.produce')
    step_4.add_output('produce')
    step_4.add_hyperparameter(name='semantic_types', argument_type=ArgumentType.VALUE, data=['https://metadata.datadrivendiscovery.org/types/TrueTarget'])
    pipeline.add_step(step_4)

    attributes = 'steps.3.produce'
    targets = 'steps.4.produce'

    # Step 5: Imputer (Data Cleaner)
    step_5 = PrimitiveStep(primitive=index.get_primitive('d3m.primitives.data_cleaning.imputer.SKlearn'))
    step_5.add_argument(name='inputs', argument_type=ArgumentType.CONTAINER, data_reference=attributes)
    step_5.add_output('produce')
    pipeline.add_step(step_5)

    # Step 6: Classifier
    step_6 = PrimitiveStep(primitive=index.get_primitive('d3m.primitives.classification.decision_tree.SKlearn'))
    step_6.add_argument(name='inputs', argument_type=ArgumentType.CONTAINER, data_reference='steps.5.produce')
    step_6.add_argument(name='outputs', argument_type=ArgumentType.CONTAINER, data_reference=targets)
    step_6.add_output('produce')
    pipeline.add_step(step_6)

    # Final output
    pipeline.add_output(name='output predictions', data_reference='steps.6.produce')

    # print(pipeline.to_json())
    with open('./pipeline.json', 'w') as write_file:
        write_file.write(pipeline.to_json(indent=4, sort_keys=False, ensure_ascii=False))
Once the pipeline is constructed and its JSON file is generated, the pipeline is run using the
``python3 -m d3m runtime`` command.
Successfully running the pipeline validates that the primitive is working as intended.

.. code:: shell

    docker run --rm -v /home/foo/d3m:/mnt/d3m -it \
        registry.gitlab.com/datadrivendiscovery/images/primitives:ubuntu-bionic-python36-v2020.1.9 \
        /bin/bash -c "cd /mnt/d3m; \
            pip3 install -e .; \
            cd pipelines; \
            python3 -m d3m runtime fit-produce \
                --pipeline pipeline.json \
                --problem /datasets/seed_datasets_current/38_sick/TRAIN/problem_TRAIN/problemDoc.json \
                --input /datasets/seed_datasets_current/38_sick/TRAIN/dataset_TRAIN/datasetDoc.json \
                --test-input /datasets/seed_datasets_current/38_sick/TEST/dataset_TEST/datasetDoc.json \
                --output 38_sick_results.csv \
                --output-run pipeline_run.yml; \
            exit"
.. _static-files:

Advanced: Primitive with static files
-------------------------------------

When building primitives that use external/static files, e.g., pre-trained weights, the
metadata for the primitive must properly define such dependencies.
The static file can be hosted anywhere based on your preference, as long as the URL to the file is a direct download link. It must
be public so that users of your primitive can access the file. Be sure to keep the URL available, as
older versions of the primitive could start failing if the URL stops resolving.

.. note::
    The full code of this section can be found in the `quickstart repository <https://gitlab.com/datadrivendiscovery/docs-quickstart>`__.

Below is a description of the required primitive metadata definition, named ``_weights_configs``, for
each static file.
.. code:: python

    _weights_configs = [{
        'type': 'FILE',
        'key': '<Weight File Name>',
        'file_uri': '<URL to directly download the Weight File>',
        'file_digest': '<sha256sum of the Weight File>',
    }]
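The ``file_digest`` value is the SHA-256 checksum of the static file. It can be computed with ``sha256sum <file>`` on the command line or, as sketched below, with Python's standard ``hashlib`` module (the helper function name is ours, not part of the D3M API):

```python
import hashlib

def sha256_of_file(path, chunk_size=8192):
    """Compute the SHA-256 hex digest of a file, reading it in chunks
    so that large weight files do not have to fit in memory."""
    digest = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            digest.update(chunk)
    return digest.hexdigest()
```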
This ``_weights_configs`` should be added directly to the ``installation`` field of the primitive metadata.
.. code:: python

    import os

    from d3m import utils as d3m_utils
    from d3m.primitive_interfaces import base, transformer
    from d3m.metadata import base as metadata_base, hyperparams

    __all__ = ('ExampleTransform',)


    class ExampleTransform(transformer.TransformerPrimitiveBase[Inputs, Outputs, Hyperparams]):
        """
        Docstring.
        """

        _weights_configs = [{
            'type': 'FILE',
            'key': '<Weight File Name>',
            'file_uri': '<URL to directly download the Weight File>',
            'file_digest': '<sha256sum of the Weight File>',
        }]

        metadata = ...
            'installation': [{
                'type': metadata_base.PrimitiveInstallationType.PIP,
                'package_uri': 'git+<git-link-to-project>@{git_commit}#egg=<Package_name>'.format(
                    git_commit=d3m_utils.current_git_commit(os.path.dirname(__file__)),
                ),
            }] + _weights_configs,
            ...

        ...
After the primitive metadata definition, it is important to include code that returns the path of the static files.
An example is given as follows:

.. code:: python

    def _find_weights_path(self, key_filename):
        if key_filename in self.volumes:
            weight_file_path = self.volumes[key_filename]
        else:
            # _weights_configs is a list, so index the relevant entry.
            weight_file_path = os.path.join('.', self._weights_configs[0]['file_digest'], key_filename)

        if not os.path.isfile(weight_file_path):
            raise ValueError(
                "Can't get weights file from volumes by key '{key_filename}' and at path '{path}'.".format(
                    key_filename=key_filename,
                    path=weight_file_path,
                ),
            )

        return weight_file_path
In this example code, the ``_find_weights_path`` method will try to find the static files among the volumes, based on the weight file key.
If a file cannot be found there (e.g., the runtime was not provided with static files), the method looks in the current directory.
The latter fallback is useful during development.
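As a usage sketch, this resolution logic can be exercised outside the D3M runtime with a simplified stand-in class (the file name, URI, and digest below are hypothetical, and the existence check is omitted for brevity):

```python
import os

class WeightsExample:
    """Simplified stand-in showing how a weights file path is resolved:
    first from runtime-provided volumes, then from a local fallback path."""

    _weights_configs = [{
        'type': 'FILE',
        'key': 'example_weights.bin',  # hypothetical file name
        'file_uri': 'https://example.com/example_weights.bin',  # hypothetical URI
        'file_digest': 'abc123',  # placeholder digest
    }]

    def __init__(self, volumes=None):
        self.volumes = volumes or {}

    def find_weights_path(self, key_filename):
        if key_filename in self.volumes:
            # The runtime mounted the static files and passed their paths in.
            return self.volumes[key_filename]
        # Development fallback: ./<file_digest>/<file>
        return os.path.join('.', self._weights_configs[0]['file_digest'], key_filename)

# With volumes provided (as the runtime would do):
resolved = WeightsExample({'example_weights.bin': '/static/abc123/example_weights.bin'}).find_weights_path('example_weights.bin')
```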
To run a pipeline with such a primitive, you have to download the static files and provide them to the runtime:
.. code:: shell

    docker run --rm -v /home/foo/d3m:/mnt/d3m -it \
        registry.gitlab.com/datadrivendiscovery/images/primitives:ubuntu-bionic-python36-v2020.1.9 \
        /bin/bash -c "cd /mnt/d3m; \
            pip3 install -e .; \
            cd pipelines; \
            mkdir /static; \
            python3 -m d3m index download -p d3m.primitives.path.of.Primitive -o /static; \
            python3 -m d3m runtime --volumes /static fit-produce \
                --pipeline feature_pipeline.json \
                --problem /datasets/seed_datasets_current/22_handgeometry/TRAIN/problem_TRAIN/problemDoc.json \
                --input /datasets/seed_datasets_current/22_handgeometry/TRAIN/dataset_TRAIN/datasetDoc.json \
                --test-input /datasets/seed_datasets_current/22_handgeometry/TEST/dataset_TEST/datasetDoc.json \
                --output 22_handgeometry_results.csv \
                --output-run feature_pipeline_run.yml; \
            exit"
The static files will be downloaded and stored locally based on the ``file_digest`` of ``_weights_configs``.
This way, the same file used by multiple primitives is not duplicated:

.. code:: shell

    mkdir /static
    python3 -m d3m index download -p d3m.primitives.path.of.Primitive -o /static

``-p`` is an optional argument to download static files only for a particular primitive, matching on its Python path.
``-o`` is an optional argument to download the static files into a common folder; if not provided, they are
downloaded into the current directory.
After the download, the file structure is as follows::

    /static/
        <file_digest>/
            <file>
        <file_digest>/
            <file>
        ...
