Total Cleanup and Restructuring (#248)

* Began the great cleanup of 2020

* Removed pretty much everything else

* Forgot a couple

* Began restructuring

* Moved documentation to its own project environment

* Added back some old expression based model functionality

* Began working on cleaning up examples

* Continued cleaning up examples

* Added ignoring of virtual environment folders

* More work cleaning up examples

* Still working on cleaning up examples

* Removed last manifest

* Removed unnecessary travis lines

* Updating travis

* travis automatically tests

* Fixed travis documentation typo

* Updated jupyter notebook examples

* Updated dockerfile

* Removed last of the unnecessary files

* Updated README

* Removed all documentation errors

* Added github actions

* Fixed missing LightGraphs dependency
unit_tests
Micah Halter committed f85f5febf7 (1 year ago, via GitHub)
100 changed files:

1. .github/workflows/doc.yml (23)
2. .github/workflows/test.yml (21)
3. .gitignore (11)
4. .travis.yml (21)
5. FluModel.ipynb (623)
6. FluModel.jl (105)
7. Project.toml (56)
8. README.md (46)
9. REQUIRE (16)
10. SemanticModelsIn10Mins.ipynb (717)
11. bin/extract.jl (449)
12. bin/extractvars.jl (109)
13. bin/transformations.jl (57)
14. doc/Project.toml (5)
15. doc/make.jl (11)
16. doc/src/extraction.md (23)
17. doc/src/graph.md (51)
18. doc/src/img/layers.dot (12)
19. doc/src/index.md (5)
20. doc/src/library.md (22)
21. doc/src/news.md (120)
22. doc/src/slides.md (21)
23. doc/src/theory.md (32)
24. doc/src/workflow.md (67)
25. docker/Dockerfile (8)
26. examples/Algebra/Manifest.toml (444)
27. examples/Algebra/Project.toml (3)
28. examples/ExprModels/Project.toml (12)
29. examples/ExprModels/agentbased.jl (0)
30. examples/ExprModels/agentgraft.jl (9)
31. examples/ExprModels/heiko.ipynb (475)
32. examples/ExprModels/modelmacro.jl (10)
33. examples/ExprModels/monomial_regression.jl (14)
34. examples/ExprModels/multivariate_regression.jl (13)
35. examples/ExprModels/odegraft.jl (18)
36. examples/ExprModels/polynomial_regression.jl (33)
37. examples/ExprModels/pseudo_polynomial_regression.jl (12)
38. examples/ExprModels/regression.jl (5)
39. examples/ExprModels/valuecapture.jl (30)
40. examples/ExprModels/workflow.jl (51)
41. examples/agentbased2.jl (160)
42. examples/agenttypes.jl (172)
43. examples/agenttypes2.jl (239)
44. examples/covid/Project.toml (2)
45. examples/covid/covid.jl (11)
46. examples/covid/odemodel.jl (14)
47. examples/covid/validation.jl (10)
48. examples/dataflow.jl (685)
49. examples/decorations/Project.toml (8)
50. examples/decorations/decoration-demo.jl (9)
51. examples/decorations/graphs.jl (10)
52. examples/decorations/relolog.jl (6)
53. examples/definitions.jl (112)
54. examples/diffeq.jl (130)
55. examples/graph.jl (495)
56. examples/knowledge_graph/Build_Demo_Knowledge_Graph.ipynb (1101)
57. examples/knowledge_graph/Build_Demo_Knowledge_Graph.jl (137)
58. examples/knowledge_graph/Build_Demo_Knowledge_Graph.md (161)
59. examples/knowledge_graph/data/kg_edge_types.csv (20)
60. examples/knowledge_graph/data/kg_edges.csv (7)
61. examples/knowledge_graph/data/kg_schema.csv (16)
62. examples/knowledge_graph/data/kg_vertex_types.csv (23)
63. examples/knowledge_graph/data/kg_vertices.csv (20)
64. examples/knowledge_graph/data/synth_kg_edges.csv (13)
65. examples/knowledge_graph/manual/knowledge_elements_edges .csv (14)
66. examples/knowledge_graph/manual/knowledge_elements_vertex.csv (12)
67. examples/knowledge_graph/manual/relations.csv (44)
68. examples/malaria/Project.toml (9)
69. examples/malaria/img/birdmal.svg (2)
70. examples/malaria/img/birdmal_concept.svg (0)
71. examples/malaria/img/birdmal_overview.png (0)
72. examples/malaria/img/birdmal_sol_multiepi.pdf (0)
73. examples/malaria/img/birdmal_sol_multiepi_zoom.pdf (0)
74. examples/malaria/img/birdmal_sol_subcritical.png (0)
75. examples/malaria/img/birdmal_wd.svg (276)
76. examples/malaria/img/birds.svg (2)
77. examples/malaria/img/birds_wd.svg (86)
78. examples/malaria/img/foodchain_concept.svg (0)
79. examples/malaria/img/foodstar_concept.svg (0)
80. examples/malaria/img/lotka_volterra_concept.svg (0)
81. examples/malaria/img/lv.svg (2)
82. examples/malaria/img/lv_overview.png (0)
83. examples/malaria/img/lv_sol.pdf (0)
84. examples/malaria/img/lv_wd.svg (82)
85. examples/malaria/img/malaria.svg (2)
86. examples/malaria/img/malaria_concept.svg (0)
87. examples/malaria/img/malaria_labeled.png (0)
88. examples/malaria/img/malaria_labeled.svg (0)
89. examples/malaria/img/malaria_sol_endemic.png (0)
90. examples/malaria/img/malaria_sol_fatal.png (0)
91. examples/malaria/malaria.html (0)
92. examples/malaria/malaria.ipynb (4272)
93. examples/malaria/malaria.jl (2)
94. examples/monomial_regression.jl (131)
95. examples/petri/Project.toml (6)
96. examples/petri/covid/Manifest.toml (1019)
97. examples/petri/covid/fallback.jl (31)
98. examples/petri/img/birdmal_wd.svg (276)
99. examples/petri/img/birds_wd.svg (86)
100. examples/petri/img/lv_wd.svg (82)

.github/workflows/doc.yml (23)

@@ -0,0 +1,23 @@
name: Documentation
on: [push, pull_request]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: "Set up Julia"
        uses: julia-actions/setup-julia@latest
        with:
          version: '1.4'
      - name: "Install system dependencies"
        run: |
          sudo apt-get update
          sudo apt-get install graphviz ttf-dejavu
      - name: "Install Julia dependencies"
        run: julia --project=doc -e 'using Pkg; Pkg.develop(PackageSpec(path=pwd())); Pkg.instantiate();'
      - name: "Build and deploy docs"
        env:
          DOCUMENTER_KEY: ${{ secrets.DOCUMENTER_KEY }}
        run: julia --project=doc doc/make.jl
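The same Julia invocations can be replayed outside CI. A dry-run sketch (the `run` wrapper only echoes each command, so nothing below requires Julia or apt to be installed; on a real machine with Julia 1.4 you would drop the wrapper and export `DOCUMENTER_KEY` for the deploy step):

```shell
# Dry-run sketch of the doc-build steps from .github/workflows/doc.yml.
# `run` only prints each command instead of executing it.
run() { echo "+ $*"; }

run sudo apt-get install graphviz ttf-dejavu
run julia --project=doc -e 'using Pkg; Pkg.develop(PackageSpec(path=pwd())); Pkg.instantiate();'
run julia --project=doc doc/make.jl
```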

.github/workflows/test.yml (21)

@@ -0,0 +1,21 @@
name: Tests
on: [push, pull_request]
jobs:
  test:
    runs-on: ${{ matrix.os }}
    strategy:
      fail-fast: false
      matrix:
        julia-version: ['1.0', '1.3', '1.4']
        os: [ubuntu-latest]
    steps:
      - uses: actions/checkout@v2
      - name: "Set up Julia"
        uses: julia-actions/setup-julia@latest
        with:
          version: ${{ matrix.julia-version }}
      - name: "Run tests"
        uses: julia-actions/julia-runtest@master
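The `julia-version` matrix above can likewise be sketched as a local dry run. The `julia-$v` launcher names below are a hypothetical convention for side-by-side Julia installs, not part of the workflow:

```shell
# Dry-run sketch of the test matrix from .github/workflows/test.yml:
# one Pkg.test() run per Julia version on the matrix axis.
# `run` only prints each command; julia-$v launchers are hypothetical.
run() { echo "+ $*"; }

for v in 1.0 1.3 1.4; do
  run julia-$v --project -e 'using Pkg; Pkg.test()'
done
```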

.gitignore (11)

@@ -1,4 +1,4 @@
/Manifest.toml
Manifest.toml
*.jl.cov
*.jl.*.cov
*.jl.mem
@@ -278,3 +278,12 @@ examples/*.dot.svg
doc/src/img/petri/*.png
/doc/workflow.slides.html
/examples/petri/covid/img/
# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

.travis.yml (21)

@@ -11,24 +11,19 @@ addons:
julia:
- 1.0
matrix:
allow_failures:
- julia: nightly
- 1.4
- nightly
notifications:
email: false
after_success:
# - julia -e 'using Pkg; cd(Pkg.dir("SemanticModels")); Pkg.add("Coverage"); using Coverage; Coveralls.submit(Coveralls.process_folder())';
# - julia -e 'using Pkg; cd(Pkg.dir("SemanticModels")); Pkg.add("Coverage"); using Coverage; Codecov.submit(Codecov.process_folder())';
jobs:
allow_failures:
- julia: nightly
include:
- stage: "Testing"
script:
- travis_wait 30 julia --project -e 'using Pkg; Pkg.build(); Pkg.test("SemanticModels")'
- stage: "Documentation"
julia: 1.0
script:
- julia --project -e 'using Pkg; Pkg.instantiate();'
- julia --project doc/make.jl
- julia --project=doc/ -e 'using Pkg; Pkg.develop(PackageSpec(path=pwd()));
Pkg.instantiate()'
- julia --project=doc/ doc/make.jl

FluModel.ipynb (623)

@@ -1,623 +0,0 @@
{
"cells": [
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"using Pkg\n",
"Pkg.activate(\".\")\n",
"using SemanticModels\n",
"using SemanticModels.Unitful: DomainError, s, d, C, uconvert, NoUnits\n",
"using DifferentialEquations\n",
"using DataFrames\n",
"using Unitful\n",
"using Test\n",
"\n",
"using Distributions: Uniform\n",
"using GLM\n",
"using DataFrames\n"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [],
"source": [
"using Plots"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"stripunits (generic function with 1 method)"
]
},
"execution_count": 4,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"stripunits(x) = uconvert(NoUnits, x)"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"CombinedModel{Array{CombinedModel{Array{SpringModel{Array{Quantity{Float64,๐“^-2,Unitful.FreeUnits{(d^-2,),๐“^-2,nothing}},1},Tuple{Quantity{Int64,๐“,Unitful.FreeUnits{(d,),๐“,nothing}},Quantity{Float64,๐“,Unitful.FreeUnits{(d,),๐“,nothing}}},Array{Quantity{Float64,D,U} where U where D,1}},1},getfield(Main, Symbol(\"#create_sir#3\"))},1},getfield(Main, Symbol(\"#create_flu#4\"))}(CombinedModel{Array{SpringModel{Array{Quantity{Float64,๐“^-2,Unitful.FreeUnits{(d^-2,),๐“^-2,nothing}},1},Tuple{Quantity{Int64,๐“,Unitful.FreeUnits{(d,),๐“,nothing}},Quantity{Float64,๐“,Unitful.FreeUnits{(d,),๐“,nothing}}},Array{Quantity{Float64,D,U} where U where D,1}},1},getfield(Main, Symbol(\"#create_sir#3\"))}[CombinedModel{Array{SpringModel{Array{Quantity{Float64,๐“^-2,FreeUnits{(d^-2,),๐“^-2,nothing}},1},Tuple{Quantity{Int64,๐“,FreeUnits{(d,),๐“,nothing}},Quantity{Float64,๐“,FreeUnits{(d,),๐“,nothing}}},Array{Quantity{Float64,D,U} where U where D,1}},1},#create_sir#3}(SpringModel{Array{Quantity{Float64,๐“^-2,Unitful.FreeUnits{(d^-2,),๐“^-2,nothing}},1},Tuple{Quantity{Int64,๐“,Unitful.FreeUnits{(d,),๐“,nothing}},Quantity{Float64,๐“,Unitful.FreeUnits{(d,),๐“,nothing}}},Array{Quantity{Float64,D,U} where U where D,1}}[SpringModel{Array{Quantity{Float64,๐“^-2,FreeUnits{(d^-2,),๐“^-2,nothing}},1},Tuple{Quantity{Int64,๐“,FreeUnits{(d,),๐“,nothing}},Quantity{Float64,๐“,FreeUnits{(d,),๐“,nothing}}},Array{Quantity{Float64,D,U} where U where D,1}}(Quantity{Float64,๐“^-2,Unitful.FreeUnits{(d^-2,),๐“^-2,nothing}}[0.000342466 d^-2], (0 d, 753.982 d), Quantity{Float64,D,U} where U where D[25.0 C, 0.0 C d^-1])], #create_sir#3())], getfield(Main, Symbol(\"#create_flu#4\"))())"
]
},
"execution_count": 5,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"function flusim(tfinal)\n",
" # annual cycle of temperature control flu infectiousness\n",
" springmodel = SpringModel([u\"(1.0/(365*8))d^-2\"], # parameters (frequency)\n",
" (u\"0d\",tfinal), # time domain\n",
" [u\"25.0C\", u\"0C/d\"]) # initial_conditions T, T'\n",
" function create_sir(m, solns)\n",
" sol = solns[1]\n",
" initialS = u\"10000person\"\n",
" initialI = u\"1person\" \n",
" initialpop = [initialS, initialI, u\"0.0person\"]\n",
" ฮฒ = u\"1.0/18\"/u\"d*C\" * sol(sol.t[end-2])[1] #infectiousness\n",
" @show ฮฒ\n",
" sirprob = SIRSimulation(initialpop, #initial_conditions S,I,R\n",
" (u\"0.0d\", u\"20d\"), #time domain\n",
" SIRParams(ฮฒ, u\"40.0person/d\")) # parameters ฮฒ, ฮณ\n",
" return sirprob\n",
" end\n",
"\n",
" function create_flu(cm, solns)\n",
" sol = solns[1]\n",
" finalI = stripunits(sol(u\"8.0d\")[2]) # X\n",
" population = stripunits(sol(sol.t[end])[2])\n",
" # population = stripunits(sum(sol.u[end]))\n",
" df = SemanticModels.generate_synthetic_data(population, 0,100)\n",
" f = @formula(vaccines_produced ~ flu_patients)\n",
" model = lm(f,\n",
" df[2:length(df.year),\n",
" [:year, :flu_patients, :vaccines_produced]])\n",
" println(\"GLM Model:\")\n",
" println(model)\n",
"\n",
" year_to_predict = 1\n",
" num_flu_patients_from_sim = finalI\n",
" vaccines_produced = missing\n",
" targetDF = DataFrame(year=year_to_predict,\n",
" flu_patients=num_flu_patients_from_sim, \n",
" vaccines_produced=missing)\n",
" @show targetDF\n",
"\n",
"\n",
" return RegressionProblem(f, model, targetDF, missing)\n",
" end\n",
" cm = CombinedModel([springmodel], create_sir)\n",
" flumodel = CombinedModel([cm], create_flu)\n",
" return flumodel\n",
"end\n",
"\n",
"tfinal = 240ฯ€*u\"d\" #(~2 yrs)\n",
"flumodel = flusim(tfinal)\n"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [
{
"data": {
"image/svg+xml": [
"<?xml version=\"1.0\" encoding=\"utf-8\"?>\n",
"<svg xmlns=\"http://www.w3.org/2000/svg\" xmlns:xlink=\"http://www.w3.org/1999/xlink\" width=\"600\" height=\"400\" viewBox=\"0 0 2400 1600\">\n",
"<defs>\n",
" <clipPath id=\"clip9700\">\n",
" <rect x=\"0\" y=\"0\" width=\"2000\" height=\"2000\"/>\n",
" </clipPath>\n",
"</defs>\n",
"<defs>\n",
" <clipPath id=\"clip9701\">\n",
" <rect x=\"0\" y=\"0\" width=\"2400\" height=\"1600\"/>\n",
" </clipPath>\n",
"</defs>\n",
"<polygon clip-path=\"url(#clip9701)\" points=\"\n",
"0,1600 2400,1600 2400,0 0,0 \n",
" \" fill=\"#ffffff\" fill-rule=\"evenodd\" fill-opacity=\"1\"/>\n",
"<defs>\n",
" <clipPath id=\"clip9702\">\n",
" <rect x=\"480\" y=\"0\" width=\"1681\" height=\"1600\"/>\n",
" </clipPath>\n",
"</defs>\n",
"<polygon clip-path=\"url(#clip9701)\" points=\"\n",
"176.123,1503.47 2321.26,1503.47 2321.26,47.2441 176.123,47.2441 \n",
" \" fill=\"#ffffff\" fill-rule=\"evenodd\" fill-opacity=\"1\"/>\n",
"<defs>\n",
" <clipPath id=\"clip9703\">\n",
" <rect x=\"176\" y=\"47\" width=\"2146\" height=\"1457\"/>\n",
" </clipPath>\n",
"</defs>\n",
"<polyline clip-path=\"url(#clip9703)\" style=\"stroke:#000000; stroke-width:2; stroke-opacity:0.1; fill:none\" points=\"\n",
" 236.834,1503.47 236.834,47.2441 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9703)\" style=\"stroke:#000000; stroke-width:2; stroke-opacity:0.1; fill:none\" points=\"\n",
" 773.641,1503.47 773.641,47.2441 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9703)\" style=\"stroke:#000000; stroke-width:2; stroke-opacity:0.1; fill:none\" points=\"\n",
" 1310.45,1503.47 1310.45,47.2441 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9703)\" style=\"stroke:#000000; stroke-width:2; stroke-opacity:0.1; fill:none\" points=\"\n",
" 1847.25,1503.47 1847.25,47.2441 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9703)\" style=\"stroke:#000000; stroke-width:2; stroke-opacity:0.1; fill:none\" points=\"\n",
" 176.123,1333.66 2321.26,1333.66 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9703)\" style=\"stroke:#000000; stroke-width:2; stroke-opacity:0.1; fill:none\" points=\"\n",
" 176.123,1056.95 2321.26,1056.95 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9703)\" style=\"stroke:#000000; stroke-width:2; stroke-opacity:0.1; fill:none\" points=\"\n",
" 176.123,780.236 2321.26,780.236 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9703)\" style=\"stroke:#000000; stroke-width:2; stroke-opacity:0.1; fill:none\" points=\"\n",
" 176.123,503.525 2321.26,503.525 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9703)\" style=\"stroke:#000000; stroke-width:2; stroke-opacity:0.1; fill:none\" points=\"\n",
" 176.123,226.814 2321.26,226.814 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9701)\" style=\"stroke:#000000; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 176.123,1503.47 2321.26,1503.47 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9701)\" style=\"stroke:#000000; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 176.123,1503.47 176.123,47.2441 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9701)\" style=\"stroke:#000000; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 236.834,1503.47 236.834,1481.63 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9701)\" style=\"stroke:#000000; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 773.641,1503.47 773.641,1481.63 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9701)\" style=\"stroke:#000000; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 1310.45,1503.47 1310.45,1481.63 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9701)\" style=\"stroke:#000000; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 1847.25,1503.47 1847.25,1481.63 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9701)\" style=\"stroke:#000000; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 176.123,1333.66 208.3,1333.66 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9701)\" style=\"stroke:#000000; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 176.123,1056.95 208.3,1056.95 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9701)\" style=\"stroke:#000000; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 176.123,780.236 208.3,780.236 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9701)\" style=\"stroke:#000000; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 176.123,503.525 208.3,503.525 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9701)\" style=\"stroke:#000000; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 176.123,226.814 208.3,226.814 \n",
" \"/>\n",
"<g clip-path=\"url(#clip9701)\">\n",
"<text style=\"fill:#000000; fill-opacity:1; font-family:Arial,Helvetica Neue,Helvetica,sans-serif; font-size:48px; text-anchor:middle;\" transform=\"rotate(0, 236.834, 1557.47)\" x=\"236.834\" y=\"1557.47\">0</text>\n",
"</g>\n",
"<g clip-path=\"url(#clip9701)\">\n",
"<text style=\"fill:#000000; fill-opacity:1; font-family:Arial,Helvetica Neue,Helvetica,sans-serif; font-size:48px; text-anchor:middle;\" transform=\"rotate(0, 773.641, 1557.47)\" x=\"773.641\" y=\"1557.47\">200</text>\n",
"</g>\n",
"<g clip-path=\"url(#clip9701)\">\n",
"<text style=\"fill:#000000; fill-opacity:1; font-family:Arial,Helvetica Neue,Helvetica,sans-serif; font-size:48px; text-anchor:middle;\" transform=\"rotate(0, 1310.45, 1557.47)\" x=\"1310.45\" y=\"1557.47\">400</text>\n",
"</g>\n",
"<g clip-path=\"url(#clip9701)\">\n",
"<text style=\"fill:#000000; fill-opacity:1; font-family:Arial,Helvetica Neue,Helvetica,sans-serif; font-size:48px; text-anchor:middle;\" transform=\"rotate(0, 1847.25, 1557.47)\" x=\"1847.25\" y=\"1557.47\">600</text>\n",
"</g>\n",
"<g clip-path=\"url(#clip9701)\">\n",
"<text style=\"fill:#000000; fill-opacity:1; font-family:Arial,Helvetica Neue,Helvetica,sans-serif; font-size:48px; text-anchor:end;\" transform=\"rotate(0, 152.123, 1351.16)\" x=\"152.123\" y=\"1351.16\">-20</text>\n",
"</g>\n",
"<g clip-path=\"url(#clip9701)\">\n",
"<text style=\"fill:#000000; fill-opacity:1; font-family:Arial,Helvetica Neue,Helvetica,sans-serif; font-size:48px; text-anchor:end;\" transform=\"rotate(0, 152.123, 1074.45)\" x=\"152.123\" y=\"1074.45\">-10</text>\n",
"</g>\n",
"<g clip-path=\"url(#clip9701)\">\n",
"<text style=\"fill:#000000; fill-opacity:1; font-family:Arial,Helvetica Neue,Helvetica,sans-serif; font-size:48px; text-anchor:end;\" transform=\"rotate(0, 152.123, 797.736)\" x=\"152.123\" y=\"797.736\">0</text>\n",
"</g>\n",
"<g clip-path=\"url(#clip9701)\">\n",
"<text style=\"fill:#000000; fill-opacity:1; font-family:Arial,Helvetica Neue,Helvetica,sans-serif; font-size:48px; text-anchor:end;\" transform=\"rotate(0, 152.123, 521.025)\" x=\"152.123\" y=\"521.025\">10</text>\n",
"</g>\n",
"<g clip-path=\"url(#clip9701)\">\n",
"<text style=\"fill:#000000; fill-opacity:1; font-family:Arial,Helvetica Neue,Helvetica,sans-serif; font-size:48px; text-anchor:end;\" transform=\"rotate(0, 152.123, 244.314)\" x=\"152.123\" y=\"244.314\">20</text>\n",
"</g>\n",
"<polyline clip-path=\"url(#clip9703)\" style=\"stroke:#009af9; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 236.834,88.4582 237.148,88.4598 240.283,88.6537 266.677,103.051 328.128,221.037 416.15,553.155 516.443,1022.02 620.684,1388.97 749.289,1419.63 872.032,1006.26 \n",
" 1018.94,345.213 1158.63,90.2708 1314.44,495.532 1471.5,1203.71 1628.17,1462.26 1793.71,958.911 1961.71,239.619 2129.05,166.646 2260.55,653.591 \n",
" \"/>\n",
"<polygon clip-path=\"url(#clip9701)\" points=\"\n",
"1958.43,251.724 2249.26,251.724 2249.26,130.764 1958.43,130.764 \n",
" \" fill=\"#ffffff\" fill-rule=\"evenodd\" fill-opacity=\"1\"/>\n",
"<polyline clip-path=\"url(#clip9701)\" style=\"stroke:#000000; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 1958.43,251.724 2249.26,251.724 2249.26,130.764 1958.43,130.764 1958.43,251.724 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9701)\" style=\"stroke:#009af9; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 1982.43,191.244 2126.43,191.244 \n",
" \"/>\n",
"<g clip-path=\"url(#clip9701)\">\n",
"<text style=\"fill:#000000; fill-opacity:1; font-family:Arial,Helvetica Neue,Helvetica,sans-serif; font-size:48px; text-anchor:start;\" transform=\"rotate(0, 2150.43, 208.744)\" x=\"2150.43\" y=\"208.744\">y1</text>\n",
"</g>\n",
"</svg>\n"
]
},
"execution_count": 6,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"springmodel = flumodel.deps[1].deps[1]\n",
"sirmodel = flumodel.deps[1]\n",
"sol = solve(springmodel)\n",
"plot(sol.t./d, map(x->x[1], sol.u) ./ C)"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"ฮฒ = 1.0854014119706108 d^-1\n"
]
},
{
"data": {
"text/plain": [
"retcode: Success\n",
"Interpolation: specialized 9th order lazy interpolation\n",
"t: 16-element Array{Quantity{Float64,๐“,Unitful.FreeUnits{(d,),๐“,nothing}},1}:\n",
" 0.0 d\n",
" 0.2503747748877873 d\n",
" 0.9420079061012359 d\n",
" 1.9002912698756584 d\n",
" 2.972126338526902 d\n",
" 4.147568957524303 d\n",
" 5.418240373678765 d\n",
" 6.7948860959760315 d\n",
" 8.400900825017708 d\n",
" 10.236958450599571 d\n",
" 12.30882380127413 d\n",
" 13.728455901957311 d\n",
" 15.710893594828972 d\n",
" 17.26796601078083 d\n",
" 19.522036174270298 d\n",
" 20.0 d\n",
"u: 16-element Array{Array{Quantity{Float64,NoDims,Unitful.FreeUnits{(person,),NoDims,nothing}},1},1}:\n",
" [10000.0 person, 1.0 person, 0.0 person] \n",
" [9999.69 person, 1.31091 person, 0.00115006 person]\n",
" [9998.22 person, 2.76907 person, 0.0065442 person] \n",
" [9993.17 person, 7.80133 person, 0.0251661 person] \n",
" [9976.09 person, 24.8203 person, 0.0882149 person] \n",
" [9912.76 person, 87.9172 person, 0.322914 person] \n",
" [9661.17 person, 338.559 person, 1.27032 person] \n",
" [8652.58 person, 1343.08 person, 5.33359 person] \n",
" [5308.5 person, 4669.16 person, 23.338 person] \n",
" [1351.18 person, 8576.06 person, 73.7647 person] \n",
" [166.552 person, 9683.51 person, 150.933 person] \n",
" [37.1606 person, 9757.62 person, 206.214 person] \n",
" [4.57084 person, 9712.98 person, 283.447 person] \n",
" [0.889612 person, 9656.35 person, 343.763 person] \n",
" [0.0847902 person, 9570.48 person, 430.432 person] \n",
" [0.0516352 person, 9552.24 person, 448.71 person] "
]
},
"execution_count": 7,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"sirsol = solve(sirmodel)"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"data": {
"image/svg+xml": [
"<?xml version=\"1.0\" encoding=\"utf-8\"?>\n",
"<svg xmlns=\"http://www.w3.org/2000/svg\" xmlns:xlink=\"http://www.w3.org/1999/xlink\" width=\"600\" height=\"400\" viewBox=\"0 0 2400 1600\">\n",
"<defs>\n",
" <clipPath id=\"clip9900\">\n",
" <rect x=\"0\" y=\"0\" width=\"2000\" height=\"2000\"/>\n",
" </clipPath>\n",
"</defs>\n",
"<defs>\n",
" <clipPath id=\"clip9901\">\n",
" <rect x=\"0\" y=\"0\" width=\"2400\" height=\"1600\"/>\n",
" </clipPath>\n",
"</defs>\n",
"<polygon clip-path=\"url(#clip9901)\" points=\"\n",
"0,1600 2400,1600 2400,0 0,0 \n",
" \" fill=\"#ffffff\" fill-rule=\"evenodd\" fill-opacity=\"1\"/>\n",
"<defs>\n",
" <clipPath id=\"clip9902\">\n",
" <rect x=\"480\" y=\"0\" width=\"1681\" height=\"1600\"/>\n",
" </clipPath>\n",
"</defs>\n",
"<polygon clip-path=\"url(#clip9901)\" points=\"\n",
"228.3,1503.47 2321.26,1503.47 2321.26,47.2441 228.3,47.2441 \n",
" \" fill=\"#ffffff\" fill-rule=\"evenodd\" fill-opacity=\"1\"/>\n",
"<defs>\n",
" <clipPath id=\"clip9903\">\n",
" <rect x=\"228\" y=\"47\" width=\"2094\" height=\"1457\"/>\n",
" </clipPath>\n",
"</defs>\n",
"<polyline clip-path=\"url(#clip9903)\" style=\"stroke:#000000; stroke-width:2; stroke-opacity:0.1; fill:none\" points=\"\n",
" 287.535,1503.47 287.535,47.2441 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9903)\" style=\"stroke:#000000; stroke-width:2; stroke-opacity:0.1; fill:none\" points=\"\n",
" 781.157,1503.47 781.157,47.2441 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9903)\" style=\"stroke:#000000; stroke-width:2; stroke-opacity:0.1; fill:none\" points=\"\n",
" 1274.78,1503.47 1274.78,47.2441 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9903)\" style=\"stroke:#000000; stroke-width:2; stroke-opacity:0.1; fill:none\" points=\"\n",
" 1768.4,1503.47 1768.4,47.2441 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9903)\" style=\"stroke:#000000; stroke-width:2; stroke-opacity:0.1; fill:none\" points=\"\n",
" 2262.03,1503.47 2262.03,47.2441 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9903)\" style=\"stroke:#000000; stroke-width:2; stroke-opacity:0.1; fill:none\" points=\"\n",
" 228.3,1462.4 2321.26,1462.4 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9903)\" style=\"stroke:#000000; stroke-width:2; stroke-opacity:0.1; fill:none\" points=\"\n",
" 228.3,1110.38 2321.26,1110.38 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9903)\" style=\"stroke:#000000; stroke-width:2; stroke-opacity:0.1; fill:none\" points=\"\n",
" 228.3,758.366 2321.26,758.366 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9903)\" style=\"stroke:#000000; stroke-width:2; stroke-opacity:0.1; fill:none\" points=\"\n",
" 228.3,406.348 2321.26,406.348 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9903)\" style=\"stroke:#000000; stroke-width:2; stroke-opacity:0.1; fill:none\" points=\"\n",
" 228.3,54.33 2321.26,54.33 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9901)\" style=\"stroke:#000000; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 228.3,1503.47 2321.26,1503.47 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9901)\" style=\"stroke:#000000; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 228.3,1503.47 228.3,47.2441 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9901)\" style=\"stroke:#000000; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 287.535,1503.47 287.535,1481.63 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9901)\" style=\"stroke:#000000; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 781.157,1503.47 781.157,1481.63 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9901)\" style=\"stroke:#000000; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 1274.78,1503.47 1274.78,1481.63 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9901)\" style=\"stroke:#000000; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 1768.4,1503.47 1768.4,1481.63 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9901)\" style=\"stroke:#000000; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 2262.03,1503.47 2262.03,1481.63 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9901)\" style=\"stroke:#000000; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 228.3,1462.4 259.694,1462.4 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9901)\" style=\"stroke:#000000; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 228.3,1110.38 259.694,1110.38 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9901)\" style=\"stroke:#000000; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 228.3,758.366 259.694,758.366 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9901)\" style=\"stroke:#000000; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 228.3,406.348 259.694,406.348 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9901)\" style=\"stroke:#000000; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 228.3,54.33 259.694,54.33 \n",
" \"/>\n",
"<g clip-path=\"url(#clip9901)\">\n",
"<text style=\"fill:#000000; fill-opacity:1; font-family:Arial,Helvetica Neue,Helvetica,sans-serif; font-size:48px; text-anchor:middle;\" transform=\"rotate(0, 287.535, 1557.47)\" x=\"287.535\" y=\"1557.47\">0</text>\n",
"</g>\n",
"<g clip-path=\"url(#clip9901)\">\n",
"<text style=\"fill:#000000; fill-opacity:1; font-family:Arial,Helvetica Neue,Helvetica,sans-serif; font-size:48px; text-anchor:middle;\" transform=\"rotate(0, 781.157, 1557.47)\" x=\"781.157\" y=\"1557.47\">5</text>\n",
"</g>\n",
"<g clip-path=\"url(#clip9901)\">\n",
"<text style=\"fill:#000000; fill-opacity:1; font-family:Arial,Helvetica Neue,Helvetica,sans-serif; font-size:48px; text-anchor:middle;\" transform=\"rotate(0, 1274.78, 1557.47)\" x=\"1274.78\" y=\"1557.47\">10</text>\n",
"</g>\n",
"<g clip-path=\"url(#clip9901)\">\n",
"<text style=\"fill:#000000; fill-opacity:1; font-family:Arial,Helvetica Neue,Helvetica,sans-serif; font-size:48px; text-anchor:middle;\" transform=\"rotate(0, 1768.4, 1557.47)\" x=\"1768.4\" y=\"1557.47\">15</text>\n",
"</g>\n",
"<g clip-path=\"url(#clip9901)\">\n",
"<text style=\"fill:#000000; fill-opacity:1; font-family:Arial,Helvetica Neue,Helvetica,sans-serif; font-size:48px; text-anchor:middle;\" transform=\"rotate(0, 2262.03, 1557.47)\" x=\"2262.03\" y=\"1557.47\">20</text>\n",
"</g>\n",
"<g clip-path=\"url(#clip9901)\">\n",
"<text style=\"fill:#000000; fill-opacity:1; font-family:Arial,Helvetica Neue,Helvetica,sans-serif; font-size:48px; text-anchor:end;\" transform=\"rotate(0, 204.3, 1479.9)\" x=\"204.3\" y=\"1479.9\">0</text>\n",
"</g>\n",
"<g clip-path=\"url(#clip9901)\">\n",
"<text style=\"fill:#000000; fill-opacity:1; font-family:Arial,Helvetica Neue,Helvetica,sans-serif; font-size:48px; text-anchor:end;\" transform=\"rotate(0, 204.3, 1127.88)\" x=\"204.3\" y=\"1127.88\">2500</text>\n",
"</g>\n",
"<g clip-path=\"url(#clip9901)\">\n",
"<text style=\"fill:#000000; fill-opacity:1; font-family:Arial,Helvetica Neue,Helvetica,sans-serif; font-size:48px; text-anchor:end;\" transform=\"rotate(0, 204.3, 775.866)\" x=\"204.3\" y=\"775.866\">5000</text>\n",
"</g>\n",
"<g clip-path=\"url(#clip9901)\">\n",
"<text style=\"fill:#000000; fill-opacity:1; font-family:Arial,Helvetica Neue,Helvetica,sans-serif; font-size:48px; text-anchor:end;\" transform=\"rotate(0, 204.3, 423.848)\" x=\"204.3\" y=\"423.848\">7500</text>\n",
"</g>\n",
"<g clip-path=\"url(#clip9901)\">\n",
"<text style=\"fill:#000000; fill-opacity:1; font-family:Arial,Helvetica Neue,Helvetica,sans-serif; font-size:48px; text-anchor:end;\" transform=\"rotate(0, 204.3, 71.83)\" x=\"204.3\" y=\"71.83\">10000</text>\n",
"</g>\n",
"<polyline clip-path=\"url(#clip9903)\" style=\"stroke:#009af9; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 287.535,1462.26 312.253,1462.22 380.534,1462.01 475.14,1461.3 580.956,1458.91 697.001,1450.02 822.448,1414.73 958.357,1273.29 1116.91,804.95 1298.17,254.831 \n",
" 1502.72,98.8934 1642.87,88.4582 1838.59,94.7442 1992.31,102.719 2214.84,114.809 2262.03,117.378 \n",
" \"/>\n",
"<polygon clip-path=\"url(#clip9901)\" points=\"\n",
"1958.43,251.724 2249.26,251.724 2249.26,130.764 1958.43,130.764 \n",
" \" fill=\"#ffffff\" fill-rule=\"evenodd\" fill-opacity=\"1\"/>\n",
"<polyline clip-path=\"url(#clip9901)\" style=\"stroke:#000000; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 1958.43,251.724 2249.26,251.724 2249.26,130.764 1958.43,130.764 1958.43,251.724 \n",
" \"/>\n",
"<polyline clip-path=\"url(#clip9901)\" style=\"stroke:#009af9; stroke-width:4; stroke-opacity:1; fill:none\" points=\"\n",
" 1982.43,191.244 2126.43,191.244 \n",
" \"/>\n",
"<g clip-path=\"url(#clip9901)\">\n",
"<text style=\"fill:#000000; fill-opacity:1; font-family:Arial,Helvetica Neue,Helvetica,sans-serif; font-size:48px; text-anchor:start;\" transform=\"rotate(0, 2150.43, 208.744)\" x=\"2150.43\" y=\"208.744\">y1</text>\n",
"</g>\n",
"</svg>\n"
]
},
"execution_count": 8,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"plot(sirsol.t./d,map(x->stripunits.(x)[2], sirsol.u))"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"β = 1.0854014119706108 d^-1\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"┌ Warning: In the future eachcol will have names argument set to false by default\n",
"│   caller = evalcontrasts(::DataFrame, ::Dict{Any,Any}) at modelframe.jl:124\n",
"└ @ StatsModels /Users/jamesfairbanks/.julia/packages/StatsModels/AYB2E/src/modelframe.jl:124\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"GLM Model:\n",
"StatsModels.DataFrameRegressionModel{LinearModel{LmResp{Array{Float64,1}},DensePredChol{Float64,LinearAlgebra.Cholesky{Float64,Array{Float64,2}}}},Array{Float64,2}}\n",
"\n",
"Formula: vaccines_produced ~ 1 + flu_patients\n",
"\n",
"Coefficients:\n",
" Estimate Std.Error t value Pr(>|t|)\n",
"(Intercept) 4892.06 342.292 14.2921 <1e-24\n",
"flu_patients -0.00529781 0.0655031 -0.0808788 0.9357\n",
"\n",
"targetDF = 1×3 DataFrame\n",
"│ Row │ year  │ flu_patients │ vaccines_produced │\n",
"│     │ Int64 │ Float64      │ Missing           │\n",
"├─────┼───────┼──────────────┼───────────────────┤\n",
"│ 1   │ 1     │ 3627.46      │ missing           │\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"┌ Warning: In the future eachcol will have names argument set to false by default\n",
"│   caller = evalcontrasts(::DataFrame, ::Dict{Symbol,StatsModels.ContrastsMatrix}) at modelframe.jl:124\n",
"└ @ StatsModels /Users/jamesfairbanks/.julia/packages/StatsModels/AYB2E/src/modelframe.jl:124\n"
]
},
{
"data": {
"text/plain": [
"1-element Array{Union{Missing, Float64},1}:\n",
" 4872.838053691122"
]
},
"execution_count": 9,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"sol = solve(flumodel)"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [
{
"ename": "ErrorException",
"evalue": "type Array has no field u",
"output_type": "error",
"traceback": [
"type Array has no field u",
"",
"Stacktrace:",
" [1] getproperty(::Any, ::Symbol) at ./sysimg.jl:18",
" [2] top-level scope at In[10]:1"
]
}
],
"source": [
"print(typeof(sol.u))"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"jupytext": {
"formats": "ipynb,jl:light",
"text_representation": {
"extension": ".jl",
"format_name": "light",
"format_version": "1.3",
"jupytext_version": "0.8.6"
}
},
"kernel_info": {
"name": "julia-1.0"
},
"kernelspec": {
"display_name": "Julia 1.0.0",
"language": "julia",
"name": "julia-1.0"
},
"language_info": {
"file_extension": ".jl",
"mimetype": "application/julia",
"name": "julia",
"version": "1.0.2"
},
"nteract": {
"version": "0.12.3"
}
},
"nbformat": 4,
"nbformat_minor": 2
}

105
FluModel.jl

@ -1,105 +0,0 @@
# -*- coding: utf-8 -*-
# ---
# jupyter:
# jupytext:
# formats: ipynb,jl:light
# text_representation:
# extension: .jl
# format_name: light
# format_version: '1.3'
# jupytext_version: 0.8.6
# kernel_info:
# name: julia-1.0
# kernelspec:
# display_name: Julia 1.0.0
# language: julia
# name: julia-1.0
# ---
# +
using Pkg
Pkg.activate(".")
using SemanticModels
using SemanticModels.Unitful: DomainError, s, d, C, uconvert, NoUnits
using DifferentialEquations
using DataFrames
using Unitful
using Test
using Distributions: Uniform
using GLM
using DataFrames
# -
using Plots
stripunits(x) = uconvert(NoUnits, x)
# +
function flusim(tfinal)
# annual cycle of temperature control flu infectiousness
springmodel = SpringModel([u"(1.0/(365*8))d^-2"], # parameters (frequency)
(u"0d",tfinal), # time domain
[u"25.0C", u"0C/d"]) # initial_conditions T, T'
function create_sir(m, solns)
sol = solns[1]
initialS = u"10000person"
initialI = u"1person"
initialpop = [initialS, initialI, u"0.0person"]
β = u"1.0/18"/u"d*C" * sol(sol.t[end-2])[1] #infectiousness
@show β
sirprob = SIRSimulation(initialpop, #initial_conditions S,I,R
(u"0.0d", u"20d"), #time domain
SIRParams(β, u"40.0person/d")) # parameters β, γ
return sirprob
end
function create_flu(cm, solns)
sol = solns[1]
finalI = stripunits(sol(u"8.0d")[2]) # X
population = stripunits(sol(sol.t[end])[2])
# population = stripunits(sum(sol.u[end]))
df = SemanticModels.generate_synthetic_data(population, 0,100)
f = @formula(vaccines_produced ~ flu_patients)
model = lm(f,
df[2:length(df.year),
[:year, :flu_patients, :vaccines_produced]])
println("GLM Model:")
println(model)
year_to_predict = 1
num_flu_patients_from_sim = finalI
vaccines_produced = missing
targetDF = DataFrame(year=year_to_predict,
flu_patients=num_flu_patients_from_sim,
vaccines_produced=missing)
@show targetDF
return RegressionProblem(f, model, targetDF, missing)
end
cm = CombinedModel([springmodel], create_sir)
flumodel = CombinedModel([cm], create_flu)
return flumodel
end
tfinal = 240π*u"d" #(~2 yrs)
flumodel = flusim(tfinal)
# -
springmodel = flumodel.deps[1].deps[1]
sirmodel = flumodel.deps[1]
sol = solve(springmodel)
plot(sol.t./d, map(x->x[1], sol.u) ./ C)
sirsol = solve(sirmodel)
plot(sirsol.t./d,map(x->stripunits.(x)[2], sirsol.u))
sol = solve(flumodel)
print(typeof(sol.u))

56
Project.toml

@ -4,71 +4,17 @@ authors = ["James Fairbanks <james.fairbanks@gtri.gatech.edu>", "Micah Halter <m
version = "0.3.0"
[deps]
BoundaryValueDiffEq = "764a87c0-6b3e-53db-9096-fe964310641d"
CSV = "336ed68f-0bac-5ca0-87d4-7b16caf5d00b"
Cassette = "7057c7e9-c182-5462-911a-8362d720325c"
Catlab = "134e5e36-593f-5add-ad60-77f754baafbe"
Compose = "a81c6b42-2e10-5240-aca2-a61377ecd94b"
DataFrames = "a93c6f00-e57d-5684-b7b6-d8193f3e46c0"
DataFramesMeta = "1313f7d8-7da2-5740-9ea0-a2ca25f37964"
DelimitedFiles = "8bb1440f-4735-579b-a4ab-409b98df4dab"
Distributions = "31c24e10-a181-5473-b8eb-7969acd0382f"
Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
GLM = "38e38edf-8417-5370-95a0-9cbb8c7f171a"
GeneralizedGenerated = "6b9d7cbe-bcb9-11e9-073f-15a7a543e2eb"
GraphDataFrameBridge = "3c71623a-a715-5176-9801-629b201a4880"
JSON = "682c06a0-de6a-54ab-a142-c8b1cf79cde6"
LabelledArrays = "2ee39098-c373-598a-b85f-a56591580800"
Latexify = "23fbe1c1-3f47-55db-b15f-69d7ec21a316"
LightGraphs = "093fc24a-ae57-5d10-9952-331d41423f4d"
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
Logging = "56ddb016-857b-54e1-b83d-db4d58db5568"
LsqFit = "2fda8390-95c7-5789-9bda-21331edee243"
MacroTools = "1914dd2f-81c6-5fcd-8719-6d5c9610ff09"
MetaGraphs = "626554b9-1ddb-594c-aa3c-2596fe9399a5"
ModelingToolkit = "961ee093-0014-501f-94e3-6117800e7a78"
OrdinaryDiffEq = "1dea7af3-3e70-54e6-95c3-0bf5283fa5ed"
Petri = "4259d249-1051-49fa-8328-3f8ab9391c33"
Plots = "91a5bcdd-55d7-5caf-9e0b-520d859cae80"
Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
StaticArrays = "90137ffa-7385-5640-81b9-e52037218182"
TikzPictures = "37f6aa50-8035-52d0-81c2-5a1d08754b2d"
Unitful = "1986cc42-f94f-5a68-af5c-568840ba703d"
[compat]
CSV = "<1.0"
Cassette = "<1.0"
Catlab = "<1.0"
DataFrames = "<1.0"
DataFramesMeta = "<1.0"
DelimitedFiles = "<2.0"
Distributions = "0.23.0"
Documenter = "<1.0"
GLM = "<2.0"
GeneralizedGenerated = "0.1.4"
GraphDataFrameBridge = "<1.0"
JSON = "<1.0"
Latexify = "<1.0"
LightGraphs = "<2.0"
LinearAlgebra = "<2.0"
Logging = "<2.0"
LsqFit = "<1.0"
MacroTools = "<1.0"
MetaGraphs = "<1.0"
Petri = "0.1.3"
Plots = "<1.0"
Random = "<2.0"
StaticArrays = "0.11"
Unitful = "<1.0"
julia = "1.0"
[extras]
DiffEqBase = "2b5f629d-d688-5b77-993f-72d75c75574e"
OrdinaryDiffEq = "1dea7af3-3e70-54e6-95c3-0bf5283fa5ed"
Polynomials = "f27b6e38-b328-58d1-80ce-0feddd5e7a45"
Printf = "de0858da-6303-5e67-8744-51eddeeeb8d7"
Statistics = "10745b16-79ce-11e8-11f9-7d13ad32a3b2"
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"
[targets]
test = ["OrdinaryDiffEq", "Polynomials", "Printf", "Statistics", "Test"]
test = ["Test"]

46
README.md

@ -44,10 +44,6 @@ Pkg.test("SemanticModels")
```
Note that running the tests for the first time can take a while because `DifferentialEquations` is a large library that
requires a long precompilation step. Various functions in the `SemanticModels.Dubstep` module can also have long
precompile times, due to heavy use of generated functions.
Then you can load it at the julia REPL with `using SemanticModels`
You should start exploring the notebooks in the examples folder. These notebooks are represented in jupytext format,
@ -56,7 +52,7 @@ and are stored as julia programs you can run at the repl or in the notebook inte
1. Model augmentation: an example script `examples/decorations/graphs.jl` shows how to augment an agent-based simulation to add new
modeling components using an API for changing models at the semantic level.
2. Model Representations: SemanticModels supports extracting diagram representations of scripts and creating scripts from wiring diagram representations. See the `examples/petri/malaria.ipynb` notebook for a demonstration, as well as expanding on model augmentation by combining and composing models to build a more complex simulation.
2. Model Representations: SemanticModels supports extracting diagram representations of scripts and creating scripts from wiring diagram representations. See the `examples/malaria/malaria.ipynb` notebook for a demonstration, as well as expanding on model augmentation by combining and composing models to build a more complex simulation.
There are scripts in the folder `SemanticModels/bin` which provide command line access to some functionality of the
@ -92,27 +88,20 @@ In addition to the examples in the documentation, there are fully worked out exa
https://github.com/jpfairbanks/SemanticModels.jl/tree/master/examples/. Each subdirectory represents one self-contained
example, starting with `epicookbook`.
## Concepts
Here is a preview of the concepts used in SemanticModels, please see the full documentation for a more thorough description.
### Model Augmentation
## Model Augmentation
The primary usecase for SemanticModels.jl is to assist scientists in what we call *model augmentation*. This is the
process of taking a known model developed by another researcher (potentially a past version of yourself) and
transforming the model to create a novel model. This process can help fit an existing theory to new data, explore
alternative hypotheses about the mechanisms of a natural phenomenon, or conduct counterfactual thought experiments.
SemanticModels.ModelTool is the current home for this capability.
You can call `m = ModelTool.model(ExpAgentProblem, expr)` to lift an agent based model up to the semantic level, then apply
transformations on that `m` and then call `eval(m.expr)` to generate code for that new model. This allows you to compare
SemanticModels is the current home for this capability.
You can call `m = model(PetriModel, model)` to lift a `Petri.jl` based model up to the semantic level, then apply
transformations on that `m` and then call `solve(m)` to generate code for that new model. This allows you to compare
different variations on a theme to conduct your research.
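The lift/transform/solve workflow just described can be sketched as follows. This is a minimal sketch, not the definitive API: `sir` stands for some existing `Petri.jl` model definition, and `add_state!` is a hypothetical example of a valid transform; the exact constructors and transforms depend on your installed version of SemanticModels.

```julia
using SemanticModels

# `sir` is assumed to be an existing Petri.jl model definition (hypothetical).
m = model(PetriModel, sir)   # lift the model to the semantic level

# Apply a valid transform at the semantic level; `add_state!` is an
# illustrative placeholder for a transform defined for this model class.
add_state!(m, :Vaccinated)

sol = solve(m)               # generate and run code for the augmented model
```

Comparing `sol` against the solution of the unmodified model is the basic research loop this package supports.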
If you are working with `ODEProblem` or agent-based models, there are prebuilt types for representing those models, but
if you want to add a new class of models you will just have to write:
1. a struct `T` that holds a structured representation of instances of the model class
2. extend `ModelTool.model(::DataType{T}, expr::Expr)` to extract that information from a code representation of your model
2. extend `model(::DataType{T}, model::Model{R})` to extract that information from a model generated by a specific DSL, and lift it to the semantic level
3. a set of *valid transforms* that can be done to your model.
SemanticModels.jl provides library functions to help with steps 2 and 3 and functions for executing and comparing the
@ -121,29 +110,6 @@ outputs of different variations of the model.
We think of SemanticModels as a _post hoc_ modeling framework that enters the scene after scientific code has been
written, as opposed to a standard modeling framework that you develop before you write the scientific code.
<!---
### Overdubbing
`SemanticModels.Dubstep` provides functionality for manipulating models at execution time. Where `ModelTool` allows you
to manipulate models at the syntactic level, `Dubstep` allows you to manipulate their execution. This falls along
similar lines of static vs dynamic analysis.
You can modify a program's execution using `Cassette.overdub` and replace function calls with your own functions. For an example, see `test/transform/ode.jl`. Or you can use a new compiler pass if you need more control over the values that you want to manipulate.
-->
### Knowledge Graphs
MetaGraphs.jl is used to model the relationships between models and concepts in a knowledge graph.
There are a few different forms of knowledge graphs that can be extracted from codes.
1. The type graph: vertices are types, edges are functions between types; see `examples/agenttypes2.jl`.
2. The dataflow graph: vertices are functions and variables, edges represent dataflow (a function references a variable, or a function calls another function).
3. The conceptual knowledge graph, extracted from text: vertices are concepts, edges are relations between concepts.
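For example, the second kind of graph (dataflow between functions and variables) can be assembled directly with MetaGraphs.jl. The vertex and edge property names below are illustrative only, not a fixed schema:

```julia
using LightGraphs, MetaGraphs

g = MetaDiGraph()

# vertex 1: a function; vertex 2: a variable it produces
add_vertex!(g); set_props!(g, 1, Dict(:v_name => "solve", :v_type => "func"))
add_vertex!(g); set_props!(g, 2, Dict(:v_name => "sol",   :v_type => "var"))

# a directed edge recording the dataflow relation
add_edge!(g, 1, 2)
set_prop!(g, 1, 2, :e_rel, "output")
```

Queries over such a graph (e.g. "which variables flow into this function?") are then ordinary graph traversals.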
## Acknowledgements
This material is based upon work supported by the Defense Advanced Research Projects Agency (DARPA) under Agreement No. HR00111990008.

16
REQUIRE

@ -1,19 +1,5 @@
julia 1.0.0
CSV
Cassette
Distributions
Documenter
Catlab
LightGraphs
MetaGraphs
DataFrames
DataFramesMeta
GraphDataFrameBridge
GLM
JSON
Latexify
Unitful
Plots
MacroTools
Catlab
ModelingToolkit
Petri

717
SemanticModelsIn10Mins.ipynb
File diff suppressed because it is too large

449
bin/extract.jl

@ -1,449 +0,0 @@
module Edges
#using Pkg
#Pkg.update()
using DataFrames
using GraphDataFrameBridge
using MetaGraphs
using CSV
using LightGraphs
using Random
using DataFramesMeta
using Colors
using Logging
using SemanticModels.Graphs
using SemanticModels.Extraction
export edgetype,
edges,
create_kg_from_code_edges,
create_kg_from_markdown_edges,
format_edges_dataframe,
assign_vertex_colors_by_key,
write_graph_to_dot_file
function edgetype(var, val::Expr)
if val.head == :call
return :output
elseif length(val.args) >=2 && typeof(val.args[2]) <: Expr && val.args[2].head == :tuple
@show val.args
return :structure
else
return :takes
end
end
edgetype(var, val::Symbol) = :destructure
edgetype(var, val) = :takes
edgetype(var::Symbol, val::Symbol) = :takes
edgetype(args...) = @show args
function edges(mc, subdef, scope)
@info("Making edges",scope=scope)
edg = Any[]
for ( var,val ) in mc.vc
@show var, val
typ = edgetype(var, val)
val = typ==:structure ? val.args[2] : val
e = (scope, typ, :var, var, :val, val)
push!(edg, e)
if typ == :output
e = (scope, typ, :var, var, :exp, val)
push!(edg, e)
end
if typ == :structure
e = (scope, typ, :var, var, :tuple, val.args[2])
push!(edg, e)
end
if typeof(val) <: Expr && val.head == :vect
push!(edg, (scope, :has, :value, var, :property, :collection))
end
if typeof(val) <: Expr && val.head == :call
push!(edg, (scope, :input, :func, val.args[1], :args, Symbol.(val.args[2:end])))
end
if typ == :destructure
@debug var.args
for lhs in var.args
push!(edg, (scope, :comp, :var, val, :var, lhs))
end
end
if typ == :structure
@debug var, val
for rhs in val.args
push!(edg, (scope, :comp, :var, var, :val, rhs))
end
end
end
for (funcname, smc) in subdef
@debug "Recursing"
# @show funcname, smc
subedges = edges(smc, [], "$scope.$funcname")
for e in subedges
push!(edg, e)
end
end
return edg
end
function preprocess_vertex_name(orig_str_rep)
#TODO this should work and be faster:
# replace(orig_str_rep, "^" => "exp", "*" => "star", "-" => "neg", "\"" => " ") # \" #this comment works around a bug in emacs julia formater
clean_str = replace(
replace(
replace(
replace(orig_str_rep, "^" => "exp"),
"*" => "star"),
"-" => "neg"),
"\"" => " ") # \" #this comment works around a bug in emacs julia formater
return clean_str
end
function format_edges_dataframe(code_edges, output_path::String)
edges_df = []
for e in code_edges
# Going from left to right, there are two edges we need to create (?)
# Edge one
src_1_vname = "$(e[1])"
src_1_vtype = "missing"
#dst_1_vname = preprocess_vertex_name(("\""*"\""*"$(e[4])"*"\""*"\""))
dst_1_vname = "$(e[4])"
#dst_1_vtype = e[3] # this is the data type; we'll use it once we have reconciliation rules in place
dst_1_vtype = "missing"
src_1_vhash = Graphs.gen_vertex_hash("$src_1_vname", "src_1_vtype")
dst_1_vhash = Graphs.gen_vertex_hash("$dst_1_vname", "dst_1_vtype")
edge_1_relation = e[2]
edge_1_description = e[2]
edge_1_value = e[6] # check this; unclear if we want to (only) store as metadata or actually create edge 2
edge_1_attrs = (src_vhash = "$src_1_vhash",
src_name = src_1_vname,
src_vtype = src_1_vtype,
dst_vhash = "$dst_1_vhash",
dst_name = dst_1_vname,
dst_vtype = dst_1_vtype,
edge_relation = edge_1_relation,
edge_description = edge_1_description,
value = "$edge_1_value"
)
push!(edges_df, edge_1_attrs)
# Edge two
src_2_vname = dst_1_vname
src_2_vtype = dst_1_vtype
#dst_2_vname = preprocess_vertex_name(("\""*"\""*"$(e[6])"*"\""*"\""))
dst_2_vname = "$(e[6])"
dst_2_vtype = "missing"
src_2_vhash = dst_1_vhash
dst_2_vhash = Graphs.gen_vertex_hash("$dst_2_vname", "dst_2_vtype")
edge_2_relation = string(e[5])
edge_2_description = string(e[5])
edge_2_value = string(e[6])
edge_2_attrs = (src_vhash = "$src_2_vhash",
src_name = src_2_vname,
src_vtype = src_2_vtype,
dst_vhash = "$dst_2_vhash",
dst_name = dst_2_vname,
dst_vtype = dst_2_vtype,
edge_relation = edge_2_relation,
edge_description = edge_2_description,
value = "$edge_2_value"
)
push!(edges_df, edge_2_attrs)
end
open(output_path, "w") do io
print(io, repr(Graphs.load_graph_data(edges_df)))
end
return edges_df
end
function create_kg_from_code_edges(code_edges)
G = MetaDiGraph()
G_prime = Graphs.insert_edges_from_jl(code_edges, G)
return G_prime
end
function create_kg_from_code_edges(code_edges, G::MetaDiGraph)
G_prime = Graphs.insert_edges_from_jl(code_edges, G)
return G_prime
end
function create_kg_from_markdown_edges(path)
G = Extraction.definitiongraph(path, Extraction.sequentialnamer())
# this is a hack for now..
# TODO: modify the markdown definitions.jl script to ensure vertex/edge props match the schema
for v in vertices(G)
v_hash = Graphs.gen_vertex_hash(get_prop(G, v, :name), "missing")
set_indexing_prop!(G, v, :v_hash, "$v_hash")
set_props!(G, v,
Dict(:v_name=>get_prop(G, v, :name),
:v_type=>"missing"))
end
for e in LightGraphs.edges(G)
src_vhash = Graphs.gen_vertex_hash(get_prop(G, e.src, :name), "missing")
dst_vhash = Graphs.gen_vertex_hash(get_prop(G, e.dst, :name), "missing")
set_props!(G, Edge(e.src, e.dst), Dict(:e_rel=>"verb",
:e_desc=>"Verb",
:e_value=>"is defined as",
:weight=>1))
end
return G
end
function create_kg_from_markdown_edges(path, extraction_rule="definition")
G = Extraction.definitiongraph(path, Extraction.sequentialnamer())
# vertex_df = DataFrame(vertex=String[],
# v_hash=String[],
# v_name=String[],
# v_type=String[],
# v_text=String[])
# edges_df = DataFrame(src_vhash = String[],
# src_name = String[],
# src_vtype = String[],
# dst_vhash = String[],
# dst_name = String[],
# dst_vtype = String[],
# edge_relation = String[],
# edge_description = String[],
# value = String[])
# this is a hack for now..
# TODO: modify the markdown definitions.jl script to ensure vertex/edge props match the schema
for v in vertices(G)
v_hash = Graphs.gen_vertex_hash(get_prop(G, v, :name), "concept")
set_indexing_prop!(G, v, :v_hash, "$v_hash")
set_props!(G, v,
Dict(:v_name=>get_prop(G, v, :name),
:v_type=>"concept",
:v_text=>length(get_prop(G, v, :text))==0 ? "no_text" : get_prop(G, v, :text)))
# push!(vertex_df, (vertex="$v", v_hash="$v_hash", v_name=get_prop(G, v, :name), v_type="concept", v_text=length(get_prop(G, v, :text))==0 ? "no_text" : get_prop(G, v, :text)))
end
for e in LightGraphs.edges(G)
src_vhash = Graphs.gen_vertex_hash(get_prop(G, e.src, :name), "concept")
dst_vhash = Graphs.gen_vertex_hash(get_prop(G, e.dst, :name), "concept")
set_props!(G, Edge(e.src, e.dst), Dict(:e_rel=>"verb",
:e_desc=>"Verb",
:e_value=>"is defined as",
:weight=>1))
# push!(edges_df, (src_vhash="$src_vhash", src_name = get_prop(G, e.src, :name), src_vtype="concept",
# dst_vhash="$dst_vhash", dst_name = get_prop(G, e.dst, :name), dst_vtype = "concept",
# edge_relation= "verb", edge_description="Verb", value="is defined as"))
end
# for debugging
#CSV.write("../examples/epicookbook/data/markdown_vertices.csv",vertex_df)
#CSV.write("../examples/epicookbook/data/markdown_edges.csv",edges_df)
return G
end
""" assign_vertex_colors_by_key(G::MetaDiGraph, group_by::Symbol)
Takes as input an existing MetaDiGraph, and a group_by, which is assumed to correspond to an existing vertex prop.
Groups the vertices in G by the group_by field, computes the number of unique colors needed, and generates a hash table, with key equal to the vertex hash, and value equal to the assigned color.
see also: [`assign_edge_style_by_key`](@ref), [`write_graph_to_dot_file`](@ref)
"""
function assign_vertex_colors_by_key(G::MetaDiGraph, group_by::Symbol)
# element 1 = white; element 2 = black; for sake of making graph easier to read, add +2 buffer and start indexing colors at 3
cols = distinguishable_colors(length(unique(get_prop(G, v, group_by) for v in vertices(G)))+2, [RGB(1,1,1)])[3:end]
color_type_lookup = Dict()
for (i, v_type) in enumerate(unique([get_prop(G, v, group_by) for v in vertices(G)]))
color_type_lookup[v_type] = "#$(hex(cols[i]))"
end
vertex_color_lookup = Dict()
for v in vertices(G)
vertex_color_lookup[get_prop(G, v, :v_hash)] = color_type_lookup[get_prop(G, v,group_by)]
end
return vertex_color_lookup
end
""" assign_edge_style_by_key(G::MetaDiGraph, group_by::Symbol)
Takes as input an existing MetaDiGraph, and a group_by field, which is assumed to correspond to an existing edge prop.
Groups the directed edges in G by the group_by field, computes the number of unique edge styles needed, and generates a hash table, with key equal to the edge id, and value equal to the assigned (line) style.
see also: [`assign_vertex_colors_by_key`](@ref), [`write_graph_to_dot_file`](@ref)
"""
function assign_edge_style_by_key(G::MetaDiGraph, group_by::Symbol)
# TODO; need to figure out how to generate arbitrary styles, or put a cap on number based on schema
end
# TODO: fix syntax and punctuation errors so the graph can be loaded from a dot file
function write_graph_to_dot_file(G::MetaDiGraph, output_path::String, graph_name::String, v_color_lookup)
head = "digraph " * graph_name * " {"
open(output_path, "w") do io
println(io, head)
for v in vertices(G)
v_color = v_color_lookup[get_prop(G, v, :v_hash)]
vname = preprocess_vertex_name(get_prop(G, v, :v_name))
println(io, string("$v" * " [color=" * "\"" * "$v_color" * "\"" * " ," * " label=" * "\"" * "$vname" * "\"" * "];"))
end
for e in LightGraphs.edges(G)
src_vname = preprocess_vertex_name(get_prop(G, e.src, :v_name))
dst_vname = preprocess_vertex_name(get_prop(G, e.dst, :v_name))
e_rel = get_prop(G, e, :e_rel)
#e_value = get_prop(G, e, :e_value)
println(io, string("$(e.src)" * " -> " * "$(e.dst)" * " [label=" * "\"" * "$(e_rel)" * "\"" * "];"))
end
println(io, "}")
end
# Example dot file
# digraph graphname {
# // The label attribute can be used to change the label of a node
# a [label="Foo"];
# // Here, the node shape is changed.
# b [shape=box];
# // These edges both have different line properties
# a -> b -> c [color=blue];
# b -> d [style=dotted];
# }
end
end
# command-line usage
# julia -i --project ../bin/extract.jl ../examples/epicookbook/src/ScalingModel.jl ../examples/epicookbook/src/SEIRmodel.jl
using SemanticModels.Parsers
using SemanticModels.Graphs
using SemanticModels.Extraction
@debug "Done Loading Package"
using DataFrames
using MetaGraphs
using LightGraphs
if length(ARGS) < 1
error("You must provide a file path to a .jl file; got ARGS=$(ARGS)")
end
mdown_path = "../examples/epicookbook/epireceipes_Automates_GTRI_ASKE_2rules_output/json"
function markdowngraph(mdown_path)
try
return Edges.create_kg_from_markdown_edges(mdown_path, "definition")
catch
@warn "Failed to read markdown" path=mdown_path
return MetaDiGraph()
end
end
G_markdown = markdowngraph(mdown_path)
@info("Graph created from markdown has v vertices and e edges.", v=nv(G_markdown), e=ne(G_markdown))
num_files = length(ARGS)
global G_temp = MetaDiGraph()
for i in 1:num_files
output_path = "../examples/epicookbook/data/edges_from_code_$i.jl"
path = ARGS[i]
@info "Parsing julia script" file=path
expr = parsefile(path)
mc = defs(expr.args[3].args)
@info "script uses modules" modules=mc.modc
@info "script defines functions" funcs=mc.fc.defs
@info "script defines global variables" vars=mc.vc
subdefs = recurse(mc)
@info "local scope definitions" subdefs=subdefs
for func in subdefs
funcname = func[1]
mc = func[2]
@info "$funcname uses modules" modules=mc.modc
@info "$funcname defines functions" funcs=mc.fc.defs
@info "$funcname defines variables" vars=mc.vc
end
edg = Edges.edges(mc, subdefs, expr.args[2])
@info("Edges found", path=path)
#for e in edg
#println(e)
#end
code_edges_df = Edges.format_edges_dataframe(edg, output_path)
if i == 1
# We only need to ingest the markdown info once.
G_code = Edges.create_kg_from_code_edges(output_path, G_markdown)
else
G_code = Edges.create_kg_from_code_edges(output_path, G_temp)
end
@info("Code graph $i has v vertices and e edges.", v=nv(G_code), e=ne(G_code))
global vcolors = Edges.assign_vertex_colors_by_key(G_code, :v_type)
global G_temp = copy(G_code)
end
@info("All markdown and code files have been parsed; writing final knowledge graph to dot file")
dot_file_path = "../examples/epicookbook/data/dot_file_ex1.dot"
Edges.write_graph_to_dot_file(G_temp, dot_file_path, "G_code_and_markdown", vcolors)
# Generate svg file
run(`dot -Tsvg -O $dot_file_path`)

109
bin/extractvars.jl

@ -1,109 +0,0 @@
#!/usr/bin/env julia
# extractvars.jl is a script to print out the variables contained in a julia program
# Example Usage:
#
# julia ../bin/extractvars.jl epicookbook/src/SIRModel.jl epicookbook/src/ScalingModel.jl
#
# Example Output:
# loaded
# epicookbook/src/SIRModel.jl, "sir_sol"
# epicookbook/src/SIRModel.jl, "tspan"
# epicookbook/src/SIRModel.jl, "du[2]"
# epicookbook/src/SIRModel.jl, "(b, g)"
# epicookbook/src/SIRModel.jl, "S"
# epicookbook/src/SIRModel.jl, "pram"
# epicookbook/src/SIRModel.jl, "init"
# epicookbook/src/SIRModel.jl, "sir_prob2"
# epicookbook/src/SIRModel.jl, "(S, I, R)"
# epicookbook/src/SIRModel.jl, "du[3]"
# epicookbook/src/SIRModel.jl, "parms"
# epicookbook/src/SIRModel.jl, "β"
# epicookbook/src/SIRModel.jl, "γ"
# epicookbook/src/SIRModel.jl, "I"
# epicookbook/src/SIRModel.jl, "du[1]"
# epicookbook/src/SIRModel.jl, "sir_prob"
# epicookbook/src/ScalingModel.jl, "α"
# epicookbook/src/ScalingModel.jl, "sir_sol"
# epicookbook/src/ScalingModel.jl, "tspan"
# epicookbook/src/ScalingModel.jl, "βs[:, i]"
# epicookbook/src/ScalingModel.jl, "w"
# epicookbook/src/ScalingModel.jl, "μ"
# epicookbook/src/ScalingModel.jl, "K"
# epicookbook/src/ScalingModel.jl, "init"
# epicookbook/src/ScalingModel.jl, "(β, r, μ, K, α)"
# epicookbook/src/ScalingModel.jl, "dS"
# epicookbook/src/ScalingModel.jl, "ws"
# epicookbook/src/ScalingModel.jl, "(S, I)"
# epicookbook/src/ScalingModel.jl, "parms"
# epicookbook/src/ScalingModel.jl, "du"
# epicookbook/src/ScalingModel.jl, "r"
# epicookbook/src/ScalingModel.jl, "dI"
# epicookbook/src/ScalingModel.jl, "β"
# epicookbook/src/ScalingModel.jl, "m"
# epicookbook/src/ScalingModel.jl, "βs"
# epicookbook/src/ScalingModel.jl, "sir_prob"
# epicookbook/src/ScalingModel.jl, "i"
try
using SemanticModels
catch
import Pkg;
Pkg.add("SemanticModels")
end
using SemanticModels.Parsers
import SemanticModels.Parsers.walk
""" findassigns(expr::Expr)
findassign walks the AST of `expr` to find the assignments to variables.
This function returns a reference to the original expression so that you can modify it inplace
and is intended to help users rewrite expressions for generating new models.
See also: [`findassig