ParallelRoutineDispatchTransformation#299
Conversation
COSSEVIN Erwan seems not to be a GitHub user. You need a GitHub account to be able to sign the CLA. If you already have a GitHub account, please add the email address used for this commit to your account. You have already signed the CLA but the status is still pending? Let us recheck it.
Codecov Report: All modified and coverable lines are covered by tests ✅

Additional details and impacted files:

@@ Coverage Diff @@
##    nabr-parallel-loop-driver-trafo    #299   +/-   ##
========================================================
  Coverage         ?   95.12%
========================================================
  Files            ?      136
  Lines            ?    29787
  Branches         ?        0
========================================================
  Hits             ?    28336
  Misses           ?     1451
  Partials         ?        0

Flags with carried forward coverage won't be shown. ☔ View full report in Codecov by Sentry.
Force-pushed d1ece0e to 348e1bb (Compare)
reuterbal left a comment:
This looks very promising! A few comments just about things I noticed. More offline now...
    dr_hook_calls = self.create_dr_hook_calls(
        routine, pragma_attrs['name'],
        routine, routine.name+":"+pragma_attrs['name'],
Just a small remark for a more pythonic string composition:
Suggested change:
-     routine, routine.name+":"+pragma_attrs['name'],
+     routine, f'{routine.name}:{pragma_attrs["name"]}',
        "KPROMA", "YDDIM%NPROMA", "NPROMA"
    ]
    #TODO : do smthg for opening field_index.pkl
    with open(os.getcwd()+"/transformations/transformations/field_index.pkl", 'rb') as fp:
Just as a remark for later: this map should be loaded/compiled/supplied in the head script, and then handed to the Transformation as an argument.
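The remark above can be sketched as follows. This is a minimal, hypothetical illustration (the class and argument names are stand-ins, not Loki's actual API): the head script opens the pickle once and injects the resulting map into the transformation, so `transform_subroutine` does no file I/O of its own.

```python
# Hypothetical sketch: load field_index.pkl once in the head/driver script
# and hand the map to the transformation as a constructor argument, instead
# of opening the file inside the transformation itself.
import pickle
import tempfile
from pathlib import Path


class ParallelRoutineDispatchTransformation:
    """Stand-in for the real transformation; receives the map as an argument."""

    def __init__(self, field_index):
        # The map is injected once at construction time...
        self.field_index = field_index

    def transform_subroutine(self, routine_name):
        # ...and reused for every routine, with no file I/O here.
        return self.field_index.get(routine_name, [])


# --- head script ---
with tempfile.TemporaryDirectory() as tmp:
    pkl_path = Path(tmp) / "field_index.pkl"

    # Write a tiny example map (in reality this comes from the build setup).
    with open(pkl_path, "wb") as fp:
        pickle.dump({"CPPHINP": ["YL_ZA", "YL_ZB"]}, fp)

    # Load it once, up front, and pass it in.
    with open(pkl_path, "rb") as fp:
        field_index = pickle.load(fp)

trafo = ParallelRoutineDispatchTransformation(field_index=field_index)
print(trafo.transform_subroutine("CPPHINP"))  # ['YL_ZA', 'YL_ZB']
```

This also removes the hard-coded `os.getcwd()` path, since only the head script needs to know where the pickle lives.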
        self.new_calls = []
        # IF (ASSOCIATED (YL_ZA)) CALL FIELD_DELETE (YL_ZA)
        self.delete_calls = []
        # map[name] = [field_ptr, ptr]
        # where :
        #   field_ptr : pointer on field api object
        #   ptr : pointer to the data
        self.routine_map_temp = {}
        self.routine_map_derived = {}
Again a remark for later, and I know that this doesn't apply in your use case, but it would be a requirement for integration into the main branch, so I'm pointing it out here:
Transformations are intended to be written in a way that allows instantiating them once and then applying them to an arbitrary number of subroutines. This information is local to one routine, but currently it lives on the transformation object. Later, we should consider making these local variables in transform_subroutine and, where necessary, have methods update or return them.
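To make the concern concrete, here is an illustrative sketch with stand-in classes (not Loki's actual API) contrasting the two designs: per-routine state stored on the object leaks between calls, while state built locally inside transform_subroutine keeps a single instance reusable.

```python
# Stand-in classes illustrating the reviewer's point; the real transformation
# builds IR call nodes, but plain strings suffice to show the state issue.


class StatefulTrafo:
    """Anti-pattern: per-routine state lives on the object and leaks."""

    def __init__(self):
        self.delete_calls = []  # shared by every routine processed

    def transform_subroutine(self, routine_vars):
        self.delete_calls += [f"FIELD_DELETE ({v})" for v in routine_vars]
        return self.delete_calls


class StatelessTrafo:
    """Recommended: state is local and returned; the instance is reusable."""

    def transform_subroutine(self, routine_vars):
        delete_calls = [f"FIELD_DELETE ({v})" for v in routine_vars]
        return delete_calls


stateful = StatefulTrafo()
stateful.transform_subroutine(["YL_ZA"])
print(stateful.transform_subroutine(["YL_ZB"]))   # leaked: still contains YL_ZA

stateless = StatelessTrafo()
stateless.transform_subroutine(["YL_ZA"])
print(stateless.transform_subroutine(["YL_ZB"]))  # only YL_ZB, as expected
```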
        ptr_var=()
        for value in self.routine_map_derived.values():
            dcl = ir.VariableDeclaration(
                symbols=(value[1],)
            )
            ptr_var += (dcl,)
        routine.spec.append(ptr_var)
I would recommend using the following pattern to add the relevant declarations:
routine.variables += tuple(v[1] for v in self.routine_map_derived.values())
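The recommended one-liner can be demonstrated with a stand-in class (this is not Loki's real Subroutine/ir API, just an illustration of the tuple-building pattern): the second entry of each map value, i.e. the data pointer, is appended to the routine's variable list in one step, with no manual VariableDeclaration construction.

```python
# Stand-in for the real Subroutine: `variables` is a tuple grown via +=,
# which implicitly takes care of the corresponding declarations in Loki.


class Routine:
    """Minimal stand-in with a tuple-valued `variables` attribute."""

    def __init__(self):
        self.variables = ()


routine = Routine()

# map[name] = [field_ptr, ptr], as in the transformation above.
routine_map_derived = {
    "YL_ZA": ("field_ptr_ZA", "ptr_ZA"),
    "YL_ZB": ("field_ptr_ZB", "ptr_ZB"),
}

# The pattern from the review: pick the data pointer (second entry) of each
# map value and append them all to the routine's variable list at once.
routine.variables += tuple(v[1] for v in routine_map_derived.values())
print(routine.variables)  # ('ptr_ZA', 'ptr_ZB')
```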
        raise NotImplementedError("This type isn't implemented yet")

        # Creating the pointer on the field api object : YL%FA, YL%F_A...
        if routine.variable_map[var.name_parts[0]].type.dtype.name=="MF_PHYS_SURF_TYPE":
Just as a remark: Using routine.variable_map repeatedly can be costly, because it compiles the map every time from the variable declarations. In a routine like this, it's recommended to cache this at the top of the routine as variable_map = routine.variable_map and then use that local copy.
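The cost being described can be made visible with a small stand-in class (not Loki's real Subroutine) whose variable_map property counts how often the map is rebuilt: repeated attribute access recompiles it on every lookup, while a single local copy builds it once.

```python
# Stand-in routine whose variable_map is recomputed on each property access,
# mimicking the behaviour the reviewer warns about.


class Routine:
    """Minimal stand-in; `map_builds` counts how often the map is compiled."""

    def __init__(self, variables):
        self._variables = variables
        self.map_builds = 0

    @property
    def variable_map(self):
        self.map_builds += 1
        return {v.upper(): v for v in self._variables}


routine = Routine(["ydcpg_opts", "ydmf_phys_surf", "kproma"])

# Costly: the map is rebuilt once per lookup.
for name in ["YDCPG_OPTS", "YDMF_PHYS_SURF", "KPROMA"]:
    _ = routine.variable_map[name]
print(routine.map_builds)  # 3 rebuilds for 3 lookups

# Cheap: cache it once at the top of the routine, then reuse the local copy.
routine.map_builds = 0
variable_map = routine.variable_map
for name in ["YDCPG_OPTS", "YDMF_PHYS_SURF", "KPROMA"]:
    _ = variable_map[name]
print(routine.map_builds)  # 1 rebuild, however many lookups follow
```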
Force-pushed d1cf092 to 04c3e75 (Compare)
…t on the field api object
Force-pushed 04c3e75 to 180695a (Compare)
Force-pushed 58c95f4 to e77e88d (Compare)
Force-pushed 8cbccfe to f7b87ff (Compare)
…region in the routine
…ath to open the dictionaries and then opening them in the transformation.
Force-pushed efcfe84 to f544743 (Compare)
Hi @ecossevin, do I remember correctly that you are planning to move your transformations into a separate repository? In which case I think this PR could be closed?
Hi @reuterbal, I don't remember saying that. It's up to you: it can stay here or be moved out.
No description provided.