- Introduction to Python
- Getting started with Python and the IPython notebook
- Functions are first class objects
- Data science is OSEMN
- Working with text
- Preprocessing text data
- Working with structured data
- Using SQLite3
- Using HDF5
- Using numpy
- Using Pandas
- Computational problems in statistics
- Computer numbers and mathematics
- Algorithmic complexity
- Linear Algebra and Linear Systems
- Linear Algebra and Matrix Decompositions
- Change of Basis
- Optimization and Non-linear Methods
- Practical Optimization Routines
- Finding roots
- Optimization Primer
- Using scipy.optimize
- Gradient descent
- Newton’s method and variants
- Constrained optimization
- Curve fitting
- Finding parameters for ODE models
- Optimization of graph node placement
- Optimization of standard statistical models
- Fitting ODEs with the Levenberg–Marquardt algorithm
- 1D example
- 2D example
- Algorithms for Optimization and Root Finding for Multivariate Problems
- Expectation Maximization (EM) Algorithm
- Monte Carlo Methods
- Resampling methods
- Resampling
- Simulations
- Setting the random seed
- Sampling with and without replacement
- Calculation of Cook’s distance
- Permutation resampling
- Design of simulation experiments
- Example: Simulations to estimate power
- Check with R
- Estimating the CDF
- Estimating the PDF
- Kernel density estimation
- Multivariate kernel density estimation
- Markov Chain Monte Carlo (MCMC)
- Using PyMC2
- Using PyMC3
- Using PyStan
- C Crash Course
- Code Optimization
- Using C code in Python
- Using functions from various compiled languages in Python
- Julia and Python
- Converting Python Code to C for speed
- Optimization bake-off
- Writing Parallel Code
- Massively parallel programming with GPUs
- Writing CUDA in C
- Distributed computing for Big Data
- Hadoop MapReduce on AWS EMR with mrjob
- Spark on a local machine using 4 nodes
- Modules and Packaging
- Tour of the Jupyter (IPython3) notebook
- Polyglot programming
- What you should know and learn more about
- Wrapping R libraries with Rpy
1D example
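The model being fit can be read directly off the function f in the code below: receptor is synthesized at a constant rate a and internalized at a rate proportional to its current level, b·x. Since this is a linear first-order ODE, it also has a closed-form solution, which is handy as a sanity check on the fit:

$$\frac{dx}{dt} = a - b x, \qquad x(0) = x_0 \quad\Longrightarrow\quad x(t) = \frac{a}{b} + \left(x_0 - \frac{a}{b}\right) e^{-b t}.$$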
import numpy as np
import matplotlib.pyplot as plt
from lmfit import minimize, Parameters, report_fit
from scipy.integrate import odeint

def f(xs, t, ps):
    """Receptor synthesis-internalization model: x'(t) = a - b*x."""
    try:
        a = ps['a'].value
        b = ps['b'].value
    except (KeyError, TypeError):
        # ps is a plain sequence of values rather than lmfit Parameters
        a, b = ps
    x = xs
    return a - b*x

def g(t, x0, ps):
    """Solution to the ODE x'(t) = f(x, t, ps) with initial condition x(0) = x0."""
    x = odeint(f, x0, t, args=(ps,))
    return x

def residual(ps, ts, data):
    x0 = ps['x0'].value
    model = g(ts, x0, ps)
    return (model - data).ravel()

# true parameter values used to generate noisy synthetic data
a = 2.0
b = 0.5
true_params = [a, b]
x0 = 10.0

t = np.linspace(0, 10, 10)
data = g(t, x0, true_params)
data += np.random.normal(size=data.shape)

# set parameters including bounds
params = Parameters()
params.add('x0', value=float(data[0]), min=0, max=100)
params.add('a', value=1.0, min=0, max=10)
params.add('b', value=1.0, min=0, max=10)

# fit model and find predicted values
result = minimize(residual, params, args=(t, data), method='leastsq')
final = data + result.residual.reshape(data.shape)

# plot data and fitted curve
plt.plot(t, data, 'o')
plt.plot(t, final, '--', linewidth=2, c='blue');

# display fit statistics
report_fit(result)
[[Fit Statistics]]
    # function evals   = 29
    # data points      = 10
    # variables        = 3
    chi-square         = 10.080
    reduced chi-square = 1.440
[[Variables]]
    x0:  10.1714231 +/- 1.156777 (11.37%) (init= 10.54454)
    a:   2.56428320 +/- 1.700899 (66.33%) (init= 1)
    b:   0.52952597 +/- 0.296358 (55.97%) (init= 1)
[[Correlations]] (unreported correlations are < 0.100)
    C(a, b)   =  0.989
    C(x0, b)  =  0.453
    C(x0, a)  =  0.416
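As a follow-up, here is a minimal sketch of pulling the fitted values out of the result and overlaying the closed-form solution above on the data. It assumes the lmfit release in use exposes the fitted parameters as result.params on the object returned by minimize (older releases instead update the params object passed in), and it is meant to run directly after the snippet above, reusing its t, data, and result.

# sketch: extract fitted parameter values
# (assumes result.params exists, as in lmfit >= 0.9; for older lmfit,
# read the same values from the updated `params` object instead)
a_fit = result.params['a'].value
b_fit = result.params['b'].value
x0_fit = result.params['x0'].value

# evaluate the closed-form solution x(t) = a/b + (x0 - a/b)*exp(-b*t)
# at the fitted parameters, on a finer grid than the 10 data points
tt = np.linspace(0, 10, 100)
xx = a_fit/b_fit + (x0_fit - a_fit/b_fit)*np.exp(-b_fit*tt)

plt.plot(t, data, 'o', label='data')
plt.plot(tt, xx, '-', label='fit (closed form)')
plt.legend();

Plotting the analytic solution rather than the residual-corrected values gives an independent check that the optimizer recovered parameters close to the true a = 2.0, b = 0.5 used to generate the data.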