iam-git / WellMet (public) (License: MIT) (since 2021-08-31) (hash sha1)
WellMet is a pure Python framework for spatial structural reliability analysis, or, more specifically, for "failure probability estimation and detection of failure surfaces by adaptive sequential decomposition of the design domain".
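As context for the commits below, the central quantity is the failure probability pf = P[g(X) <= 0] for a limit state function g. The snippet that follows is only a generic crude Monte Carlo illustration of that quantity written against plain NumPy; it does not use the WellMet API, and the limit state function, reliability index and sample size are made up for the example. WellMet's adaptive decomposition of the design domain is aimed at obtaining such estimates with far fewer limit state evaluations than this brute-force approach.

import numpy as np

def crude_monte_carlo_pf(g, ndim, n_samples=200_000, seed=0):
    """Estimate pf = P[g(X) <= 0] for standard normal X by crude Monte Carlo.

    Generic illustration only, not WellMet code: `g` is any vectorized
    limit state function, failure is defined as g <= 0.
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((n_samples, ndim))
    return np.mean(g(x) <= 0)

# Hypothetical linear limit state with reliability index beta = 3
beta = 3.0
g = lambda x: beta * np.sqrt(2.0) - x[:, 0] - x[:, 1]
pf = crude_monte_carlo_pf(g, ndim=2)
print(pf)  # exact value is Phi(-beta), roughly 1.35e-3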
List of commits:
Subject Hash Author Date (UTC)
estimation: Voronoi_2_point estimation node_pf_coloring fixed 35fc3d622fe9d5cfa519f1484862108b4003688b Олёш 2020-07-27 10:35:06
candybox: pandas index mess fix, __getattr__ will return numpyed array 10e7c39a69bd105780924319bbafb58602251a9d Олёш 2020-07-27 10:31:43
qt_plot: SimplexEstimationWidget: nodes size increased, VoronoiGraph: redraw item added 8176d66545bfcfa6ee4ea64cbdf3ffd9bbf5ca11 Олёш 2020-07-24 07:01:13
qt_plot: the plots at least draw something c2c628d6edf48eb607e44902fa14022c97832cb1 Олёш 2020-07-22 14:40:30
qt_plot: enough of those widgets for now 55ee0ccf1d72d6515a66b673c809b9548a2b1ecb Олёш 2020-07-22 09:01:26
qt_plot: WIP 65694b44be8b8f1da9d5c385d37964e65c3824e7 Олёш 2020-07-22 07:21:30
qt_plot: WIP 185d6b48000c43da723ea27055fa5d9540ae2a2a Олёш 2020-07-21 21:45:00
qt_plot: WIP 72bd348eeb85a1ee52d38978a37369e412e959f5 Олёш 2020-07-21 13:24:26
qt_plot: WIP 52ef45375a5851c74111b33223eab093c1e563c2 Олёш 2020-07-20 15:10:06
estimation: simplex_estimation fix. Less wrong) 6fe77618a0654cf107f48f6441738670355dc0a7 Олёш 2020-07-20 06:25:41
qt_plot: WIP ebbad200e0692e667b828a4c2bcc70229fa797e6 Олёш 2020-07-19 11:20:05
estimation: fixes on simplex_estimation, but it still lies. 8877269530c69b57fd24ed67a4b76a350ebf5da1 Олёш 2020-07-19 11:18:39
qt_plot: WIP d654b3d598a3d8675fb2751fb6f3d5d99bb60043 Олёш 2020-07-17 02:51:50
estimation: simplex_estimation is ready (but it still talks nonsense for now) b804405fdd8ac94116038178a7d401c47830eb14 Олёш 2020-07-15 02:41:44
estimation: WIP 60686740c1f3685f3ceb2bcc478d177407ca3381 Олёш 2020-07-14 15:40:52
estimation: WIP 726058435dcf653f816390e3fc996747fff07908 Олёш 2020-07-14 02:41:20
estimation: simple tesselation added 344f275e586e42c4946f6d4f18a2503ec5f28869 Олёш 2020-07-11 22:25:08
estimation: 2 point Voronoi polishing 18f6563c0b209f4c43c4d19a9dac5f4ff74a790a Олёш 2020-07-11 05:30:10
f_models & whitebox: object generation (creation) fix 3b2e17e86e14f0eaea0e3264e8d6a4f0930505f5 Олёш 2020-07-10 20:54:23
estimation: WIP, Voronoi_2_point worked 364c89027063f0bd3a1af80d7d53db059184dd68 Олёш 2020-07-10 02:51:33
Commit 35fc3d622fe9d5cfa519f1484862108b4003688b - estimation: Voronoi_2_point estimation node_pf_coloring fixed
Author: Олёш
Author date (UTC): 2020-07-27 10:35
Committer name: Олёш
Committer date (UTC): 2020-07-27 10:35
Parent(s): 10e7c39a69bd105780924319bbafb58602251a9d
Signing status: N
Tree: 83f6a365a6d27588a6148a68c97b8ead9583f2ea
File Lines added Lines deleted
estimation.py 9 7
File estimation.py changed (mode: 100644) (index b383753..dc60aa7)
@@ ... @@ def Voronoi_2_point_estimation(sample_box, model_space='Rn', sampling_space=None
     if gradient is None:
         # indexy ii nás moc nezajimajou
         # vzdalenosti snad byjsme zvladli použit?
+        # dd1 jsou vzdalenosti tečiček do centra Voroneho buňky
         dd1 = dd[Vor_mask]

         dd2, ii2 = tree.query(h_plan_model_ma, k=[2], p=p_norm)

@@ ... @@ def Voronoi_2_point_estimation(sample_box, model_space='Rn', sampling_space=None

     # tahle hračka s indexy je pro numpy poměrně drahá
     failsii_2 = failsi[ii2]
-
     # jeden vzorek (včetně hustoty PDF[i]) je nám vždy znám
     # porucha
     if failsi[i]:
         points_1 = PDF[i] * dd2
         node_pf_estimations = points_1 / (points_1 + PDF[ii2] * dd1)
+        node_pf_estimations = np.where(failsii_2,1, node_pf_estimations)
         node_pf_pure_estimations = dd2 / (dd1 + dd2)
+        node_pf_pure_estimations = np.where(failsii_2,1, node_pf_pure_estimations)

         cell_stats['Voronoi_2_point_upper_bound'] = cell_stats['cell_probability']
-        cell_stats['Voronoi_2_point_failure_rate'] = np.sum(weights_sim*np.where(failsii_2,1, node_pf_estimations)) / nis
-        cell_stats['Voronoi_2_point_pure_failure_rate'] = np.sum(weights_sim*np.where(failsii_2,1, node_pf_pure_estimations)) / nis
+        cell_stats['Voronoi_2_point_failure_rate'] = np.sum(weights_sim * node_pf_estimations) / nis
+        cell_stats['Voronoi_2_point_pure_failure_rate'] = np.sum(weights_sim * node_pf_pure_estimations) / nis
         cell_stats['Voronoi_2_point_lower_bound'] = np.sum(weights_sim[failsii_2]) / nis
         cell_stats['Voronoi_failure_rate'] = cell_stats['cell_probability']
-        nodes=CandyBox(h_plan[Vor_mask], w=h_plan.w[Vor_mask], node_pf_estimations=node_pf_estimations,\
+        nodes=CandyBox(h_plan.sampling_plan[Vor_mask], w=h_plan.w[Vor_mask], node_pf_estimations=node_pf_estimations,\
                         node_pf_pure_estimations=node_pf_pure_estimations)

     # neporucha

@@ ... @@ def Voronoi_2_point_estimation(sample_box, model_space='Rn', sampling_space=None
         cell_stats['Voronoi_2_point_pure_failure_rate'] = np.sum(weights_sim[failsii_2] * node_pf_pure_estimations) / nis
         cell_stats['Voronoi_2_point_lower_bound'] = 0
         cell_stats['Voronoi_failure_rate'] = 0
-        nodes=CandyBox(h_plan[Vor_mask][failsii_2], w=weights_sim[failsii_2], node_pf_estimations=node_pf_estimations,\
+        nodes=CandyBox(h_plan.sampling_plan[Vor_mask][failsii_2], w=weights_sim[failsii_2], node_pf_estimations=node_pf_estimations,\
                         node_pf_pure_estimations=node_pf_pure_estimations)

         # take something with corresponding length
         zeros = np.zeros(len(weights_sim) - len(dd2))
         # add remaining nodes
-        nodes.add_sample(CandyBox(h_plan[Vor_mask][~failsii_2], w=weights_sim[~failsii_2], node_pf_estimations=zeros,\
+        nodes.add_sample(CandyBox(h_plan.sampling_plan[Vor_mask][~failsii_2], w=weights_sim[~failsii_2], node_pf_estimations=zeros,\
                         node_pf_pure_estimations=zeros))


@@ ... @@ def Voronoi_2_point_estimation(sample_box, model_space='Rn', sampling_space=None
     cell_stats['Voronoi_2_point_lower_bound'] = np.sum(h_plan.w[Vor_mask]*np.floor(node_pf_estimations)) / nis
     cell_stats['Voronoi_failure_rate'] = np.sum(h_plan.w[Vor_mask]*node_failsi) / nis

-    nodes=CandyBox(h_plan[Vor_mask], w=h_plan.w[Vor_mask], node_pf_estimations=node_pf_estimations,\
+    nodes=CandyBox(h_plan.sampling_plan[Vor_mask], w=h_plan.w[Vor_mask], node_pf_estimations=node_pf_estimations,\
                     node_pf_pure_estimations=node_pf_pure_estimations, node_failsi=node_failsi)


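The substance of the fix is easier to see outside the surrounding bookkeeping. The sketch below reproduces the two-point weighting from the diff in isolation: a candidate node inside the Voronoi cell of sample i gets a failure-probability estimate by weighting the densities of the two nearest samples by the opposite distances, and nodes whose second-nearest sample also failed are forced to 1. That "coloring" used to be applied only inside the cell_stats sums; after this commit it is applied to node_pf_estimations and node_pf_pure_estimations themselves, so the values stored in the CandyBox nodes agree with the cell statistics. Array names follow the diff, but the concrete numeric values here are made up and this is not the WellMet API.

import numpy as np

# Illustrative inputs (not taken from the repository): distances of each
# candidate node to its own sample i (dd1) and to the second nearest sample
# (dd2), densities of those samples, and the failure flag of the second one.
dd1 = np.array([0.2, 0.5, 0.9])              # distance to the cell's own sample i
dd2 = np.array([1.1, 0.6, 1.0])              # distance to the second nearest sample
pdf_i = 0.35                                 # PDF at sample i (a failure point here)
pdf_2 = np.array([0.30, 0.10, 0.25])         # PDF at the second nearest samples
failsii_2 = np.array([False, True, False])   # did the second nearest sample fail?

# Two-point estimate when sample i itself is a failure (the "# porucha" branch)
points_1 = pdf_i * dd2
node_pf_estimations = points_1 / (points_1 + pdf_2 * dd1)
node_pf_pure_estimations = dd2 / (dd1 + dd2)

# The fix: if both neighbouring samples failed, the node is certainly a failure
# node, so its estimate is set to 1 before it is stored and summed.
node_pf_estimations = np.where(failsii_2, 1, node_pf_estimations)
node_pf_pure_estimations = np.where(failsii_2, 1, node_pf_pure_estimations)

print(node_pf_estimations, node_pf_pure_estimations)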
Hints:
Before your first commit, do not forget to set up your git environment:
git config --global user.name "your_name_here"
git config --global user.email "your@email_here"

Clone this repository using HTTP(S):
git clone https://rocketgit.com/user/iam-git/WellMet

Clone this repository using ssh (do not forget to upload a key first):
git clone ssh://rocketgit@ssh.rocketgit.com/user/iam-git/WellMet

Clone this repository using git:
git clone git://git.rocketgit.com/user/iam-git/WellMet

You are allowed to push to this repository anonymously.
This means that your pushed commits will automatically be turned into a merge request:
... clone the repository ...
... make some changes and some commits ...
git push origin main
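Spelled out, that anonymous-contribution flow might look like the following, using the HTTP(S) URL from this page; the branch name main is taken from the hint above and may need to be adjusted to the repository's actual default branch, and the commit message is only a placeholder.

git clone https://rocketgit.com/user/iam-git/WellMet
cd WellMet
# ... edit files ...
git add .
git commit -m "Describe your change"
git push origin main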