Alpha 3 release ready #40
Merged
Conversation
Adds shillelagh to attempt implementing a SQL layer on top of the library and see if it helps with #9. TODO items for me are:
- see if pydantic can help define the SQLAlchemy schema
- as per #31, I want to have a SQLAlchemy dialect
- be able to sync two sources, one being the REST endpoints, using SQLAlchemy
Adds a division endpoint to the CLI, in anticipation of the fact that we want to start adding some mutative endpoints to the API client and the CLI.
There are various features in Gallagher's documentation listed as coming soon; these should be identified by the API client as such, not as unavailable or missing.
For consistency I've renamed the detail response to summary and will use the detail object to represent the detail endpoints. There are likely to be more details around this as the API expands; the documentation should reflect this.
Aiming to complete the API design documentation as I complete the implementation, hence refactoring the docs to accommodate the TUI and CLI (these will be more user-focused docs than developer docs). Keeping the structure simple for now; will divide it into folders if required and the items get bigger.
Thanks to this repository https://github.com/mkdocs/catalog I found mkdocs-typer by @bruce-szalwinski https://github.com/bruce-szalwinski/mkdocs-typer. This greatly simplifies documenting the command line interface by automatically generating the reference from the docstrings.
Adds a task endpoint where you can provide a partial URL for the Command Centre and it will add the appropriate headers and prepend it to the Aussie REST API proxy gateway. Note: this is purely for debugging purposes during development, hence the `debug` prefix.
Making sure that response classes have their own parent and are treated separately from just an object received from the servers. Responses are of two types: ones with next, back and updates links, and ones without; there are two classes that represent that. The parent response classes will know how to follow links through; this will be outlined as part of the implementation.
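The two response parents described above can be sketched roughly like this; the class and method names (`AppResponse`, `LinkedResponse`, `follow`) are hypothetical stand-ins, not the library's actual API:

```python
from dataclasses import dataclass, field
from typing import Callable, Iterator, Optional

@dataclass
class AppResponse:
    """Parent for plain responses with no navigation links."""
    results: list = field(default_factory=list)

@dataclass
class LinkedResponse(AppResponse):
    """Parent for responses that carry next/back/updates hrefs."""
    next: Optional[str] = None
    back: Optional[str] = None
    updates: Optional[str] = None

    def follow(self, fetch: Callable[[str], "LinkedResponse"]) -> Iterator[list]:
        """Walk the `next` links, yielding each page of results.

        `fetch` takes an href and returns the next LinkedResponse,
        e.g. a thin wrapper around an HTTP GET.
        """
        page = self
        while True:
            yield page.results
            if not page.next:
                break
            page = fetch(page.next)
```

The parent, rather than each endpoint model, owns the pagination loop, so every linked response gets link-following for free.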
I noticed that at various places I have given in to the lure of GPT, and some of the sentences did not read well or did not convey my intent properly. This rewrites much of the content properly.
Personal data fields are dynamically generated based on the definitions found on the Command Centre instance. This is a special case in terms of parsing responses from the server, as the keys of these responses are dynamically populated. This commit moves to using an enumeration to ensure that the value for `type` is one of what the server should send. Refs #1
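The enumeration idea can be sketched with a stdlib `Enum`; the member values here are illustrative, not Gallagher's actual list of field types:

```python
from enum import Enum

class PdfType(str, Enum):
    """Allowed values for a personal data field's `type` key.

    Illustrative values only; the real set comes from the
    Command Centre documentation.
    """
    STRING = "string"
    IMAGE = "image"
    DATE = "date"

def parse_type(raw: str) -> PdfType:
    """Reject any `type` value the server should never send."""
    try:
        return PdfType(raw)
    except ValueError:
        raise ValueError(f"unexpected personal data field type: {raw!r}")
```

Anything outside the enumeration fails loudly at parse time instead of propagating an unknown string through the models.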
Brings the cardholder detail model closer to completion, with access groups being parsed as per the definition. Refs #1
Just as we do with the discovery message, we cache the PDF fields available on the server onto the capabilities constant; this can be used to parse models that use the personal data fields. Refs #1
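A minimal sketch of the caching idea; `Capabilities` and `remember_pdf_fields` are hypothetical stand-ins for the library's actual capabilities constant and population logic:

```python
class Capabilities:
    """Process-wide cache of what the Command Centre reports,
    populated once (like the discovery response) and then reused
    when parsing models that carry personal data fields."""
    CURRENT_PDF_FIELDS: dict = {}  # field name -> field type

def remember_pdf_fields(definitions: list) -> None:
    """Populate the cache from a personal-data-field style payload.

    Each item is assumed to look like {"name": ..., "type": ...};
    the payload shape is illustrative.
    """
    for item in definitions:
        Capabilities.CURRENT_PDF_FIELDS[item["name"]] = item["type"]
```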
…rsers PDFs are dynamically populated from the data and schema that is sent for the customer detail object. This is a first attempt to see if we can parse the PDF items using `model_validate`. WARNING: nothing here may work and we may end up changing the entire direction of this implementation. Refs #1
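A hedged sketch of the first attempt, assuming pydantic v2 (`create_model` plus `model_validate`); the field names and payload are illustrative, not the actual cardholder schema:

```python
from pydantic import create_model

# Build a model at runtime from whatever field definitions the
# server sent; each entry maps a field name to (type, default),
# with `...` marking a required field.
definitions = {
    "email_address": (str, ...),
    "city": (str, ""),
}
CardholderPdf = create_model("CardholderPdf", **definitions)

# Validate a payload against the dynamically created model.
pdf = CardholderPdf.model_validate({"email_address": "ops@example.com"})
```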
I've just realised that the customer detail response sends the relevant PDF fields as part of the personalDataDefinitions field. If we can parse this and make sense of it, then the keys to the dynamic fields will exist in the response and we should not read this from the cache. Moreover, it would make sense that personalDataDefinitions is the set of keys that are part of the customer response, i.e. not every field is available or in use all the time. Changing tactic here: going to try and parse the personalDataDefinitions first and then populate the keys from there. Refs #1
personalDataDefinitions field parsing is working, given a refactoring of the dictionary of sorts and allowing pydantic to make sense of what the server sends back to the API client. The parser fails if the value key is missing from a payload, which seems to be the case in some items in the sample data for the Command Centre. Refs #1
personal_data_definitions has the validated fields accessible as pydantic objects. This makes the same values accessible as pythonic fields, so the developer's code can look a little more idiomatic, i.e. @Email Address can now be accessed as .pdf.email_address. Refs #1
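The aliasing described above can be sketched with pydantic v2's `Field(alias=...)`; the model name and payload are illustrative:

```python
from pydantic import BaseModel, Field

class Pdf(BaseModel):
    # The server sends the key as "@Email Address"; expose it to
    # developers as a pythonic attribute instead.
    email_address: str = Field(alias="@Email Address")

pdf = Pdf.model_validate({"@Email Address": "card.holder@example.com"})
```

Validation reads the server's `@`-prefixed key, while application code uses the idiomatic `pdf.email_address`.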
Personal Data Fields being parsed ahead of exposing PDF aliases
The value in a cardholder personal data field is at times an href. Changes the validation to accept str or href and removes the optional flag, leaving a comment on summary/cardholder.py:105 where the ble_facility_id is an int for mobile credentials. This was tested against our test Command Centre instance; the docs still highlight this as a reserved field. Leaving a TODO to see if we should come back to this for validation.
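A hedged sketch of the widened validation, assuming pydantic v2; `HrefMixin` is a stand-in for the library's href model, and the field names are illustrative:

```python
from typing import Union

from pydantic import BaseModel

class HrefMixin(BaseModel):
    """Stand-in for the library's href model."""
    href: str

class PdfValue(BaseModel):
    # Accept a plain string or an href object; no longer Optional,
    # so a missing value fails validation instead of silently
    # becoming None.
    value: Union[str, HrefMixin]
```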
FIDO2 was missing from our list of credentials. Also turns Enum members into uppercase constants in accordance with PEP 8: https://docs.python.org/3/howto/enum.html
As we push to get Alpha 3 out, I am putting a massive effort into all the issues that have the documentation tag on them, to get the project to completion.
📢 Preamble
This release solidifies the vision of this API client. Our current vision is to ship:
This release provides a near complete version of the Python API and the related tools, creating a foundation and making it ready for release.
✨ New

- `Typer` based command line interface, thanks to @tiangolo for maintaining it

💅 Improvements

- `task` endpoint for debugging the Gallagher REST endpoints, for example use `task debug:get -- alarms` to trigger a `get` request. This will use `httpie` to execute the request and append necessary headers and parameters
- `cli` documentation
- test suite performance by transforming network requests into `fixtures`, for example we can fetch alarms and then perform multiple validation operations on the data

🛠️ Fixes
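The fixture approach mentioned under Improvements might look roughly like this in pytest; `fetch_alarms` is a stand-in for the real API client call, with canned data instead of a network request:

```python
import pytest

def fetch_alarms():
    # In the real test suite this would be a network call to the
    # alarms endpoint; canned data keeps this sketch self-contained.
    return [{"id": 1, "priority": "high"}, {"id": 2, "priority": "low"}]

@pytest.fixture(scope="session")
def alarms():
    # session scope: the payload is fetched once and shared by
    # every test that asks for it, instead of once per test.
    return fetch_alarms()

def test_alarms_have_ids(alarms):
    assert all("id" in a for a in alarms)

def test_alarms_have_priority(alarms):
    assert all("priority" in a for a in alarms)
```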