From 5ba4c7e7700daee169a64093b5c101aeb52b9d20 Mon Sep 17 00:00:00 2001 From: Chris Lunsford Date: Mon, 17 Oct 2016 12:47:41 -0400 Subject: [PATCH 1/7] Convert README to reStructuredText and Revise #15 Initial README.rst --- README.rst | 147 +++++++++++++++++++++++++++++++++++++++++++++++++++++ 1 file changed, 147 insertions(+) create mode 100644 README.rst diff --git a/README.rst b/README.rst new file mode 100644 index 0000000..eee7c55 --- /dev/null +++ b/README.rst @@ -0,0 +1,147 @@ +============= +ciscosparkapi +============= + +---------------------------------------------------------------------------- +Simple, lightweight and scalable Python API wrapper for the Cisco Spark APIs +---------------------------------------------------------------------------- + +Sure, working with the Cisco Spark APIs is easy (see `devloper.ciscospark.com`_). They are RESTful, simply and naturally structured, require only a simple Access Token to authenticate, and the data is elegantly represented in intuitive JSON. What could be easier? + +.. code-block:: python + import requests + + URL = 'https://api.ciscospark.com/v1/messages' + ACCESS_TOKEN = '' + ROOM_ID = '' + MESSAGE_TEXT = '' + + headers = {'Authorization': 'Bearer ' + ACCESS_TOKEN, + 'Content-type': 'application/json;charset=utf-8'} + post_data = {'roomId': ROOM_ID, + 'text': MESSAGE_TEXT} + response = requests.post(URL, json=post_data, headers=headers) + if response.status_code == 200: + # Great, your message was posted! + message_id = response.json()['id'] + message_text = response.json()['text'] + print("New message created, with ID:", message_id) + print(message_text) + else: + # Oops something went wrong... Better do something about it. + print(response.status_code, response.text) + +Like I said, easy. However, it is rather repetitive... + +- You have to setup this same environment every time you want do something with just one of the API calls. +- ...and this was just posting a text message to Spark, what about when you want to do something like uploading a file? +- What if you your application or use case needs to make calls to several of the Spark APIs? +- This can turn into a lot of boiler plate code! +- Sure, you can consolidate the repetitious code into functions, and end up building your own library of functions (and many have done this!), or you can build upon the shoulders of those who have gone before you... + +Enter **ciscosparkapi**, a simple API wrapper that wraps the RESTful Spark API calls and returned JSON objects within native Python objects and function calls. + +The above Python code can be consolidated to the following: + +.. code-block:: python + from ciscosparkapi import CiscoSparkAPI + + api = CiscoSparkAPI() # Creates a new API 'connection object' + try: + message = api.messages.create('', text='') # Creates a new message and raises an exception if something goes wrong. + print("New message created, with ID:", message.id) + print(message.text) + except SparkApiError as e: # Handles the exception, if something goes wrong + print(e) + +The ciscosparkapi package handles all of this for you: + ++ Your Spark access token can automatically be retrieved from a ``SPARK_ACCESS_TOKEN`` environment variable. *You don't have to provide it every time you create a new API connection object.* ++ *You don't have to remember the API endpoint URLs or JSON parameters.* They have been wrapped in native Python methods.
++ If your Python IDE supports **auto-completion** (like PyCharm_), *you can simply navigate the available methods and object attributes right within your IDE*. ++ The JSON objects returned from the Cisco Spark cloud are modeled as native Python objects, which also support auto-completion and native attribute access. *You don't have to think about parsing the JSON objects or working with dictionaries or creating lots of variables to hold the returned object's attributes. You can simply interact with the returned object as a native Python object.* ++ When requesting 'lists of objects' from Spark, like enumerating the messages in a room or a list of rooms of which you are a member, you don't have think about handling and requesting pages_ of responses. These are simply and efficiently abstracted and requested as needed - as you access the returned objects. *Other than a slight delay as additional objects are requested from the Spark cloud, you won't have to deal with or think about pages of responses.* + +...which lets you do powerful things simply: + +.. code-block:: python + from ciscosparkapi import CiscoSparkAPI + + api = CiscoSparkAPI() + + # Find all of rooms that have 'ciscosparkapi Demo' in their title + all_rooms = api.rooms.list() + demo_rooms = [room for room in all_rooms if 'ciscosparkapi Demo' in room.title] + + # Delete all of the demo rooms + for room in demo_rooms: + api.rooms.delete(room.id) + + # Create a new demo room + demo_room = api.rooms.create('ciscosparkapi Demo') + + # Add people to the new demo room + email_addresses = ["test01@cmlccie.com", "test02@cmlccie.com"] + for email_address in email_addresses: + api.memberships.create(demo_room.id, personEmail=email_address) + + # Post a message to the new room, and upload a file + api.messages.create(demo_room.id, text="Welcome to the room!", files=["welcome.jpg"]) + +That's at least six Spark API calls, and likely more than that depending on how rooms are returned by Spark (remember paging is handled for you automatically) and how many people you add to the room. All in only about 23 lines (which includes comments). + + +Installation +------------ + +ciscosparkapi is available on PyPI. Install it via PIP, or alternatively you can download the package from GitHub and install it via setuptools. + +**PIP Installation** +.. code-block:: bash + $ pip install ciscosparkapi + +**git / setuptools Installation** +.. code-block:: bash + $ git clone https://github.com/CiscoDevNet/ciscosparkapi.git + $ python ciscosparkapi/setup.py install + + +Releases & Release Notes +------------------------ + +Complete and usable *Beta* releases have been published for this package. + +While the package APIs may change while in beta, the package capabilities should all be functional. If you expereince any issues using this package, please report them using the issues_ log on the packages GitHub page. + +Please see the releases_ page for release notes on the incremental functionality and bug fixes that have been incorporated into the published releases. + + +Contribution +------------ + +ciscosparkapi_ and it's sister project ciscosparksdk_ are community development projects. Feedback, thoughts, ideas and code contributions are most welcome! + +To contribute to ciscosparkapi_ please use the following resources: +* Feedback, issues, thoughts and ideas... Please use the issues_ log. +* Interested in contributing code? + # Check for open issues_ or create a new one.
+ * Assign yourself to the issue you want to work on, and communicate with any others that may be working the issue. + * Project workflow is being managed via the GitHub projects_ feature. Move your issue to the 'In Progress' column of the project being worked. + # Review the project charter_ for coding standards and practices. + # Fork a copy of `the repository`_. + # Add your code to your forked repository. + # Submit a `pull request`_, and move your issue to the 'Code Review' column on the projects_ page. + + +.. _devloper.ciscospark.com: https://developer.ciscospark.com +.. _pages: https://developer.ciscospark.com/pagination.html +.. _PyCharm: https://www.jetbrains.com/pycharm/ +.. _ciscosparkapi: https://github.com/CiscoDevNet/ciscosparkapi +.. _ciscosparksdk: https://github.com/CiscoDevNet/ciscosparksdk +.. _issues: https://github.com/CiscoDevNet/ciscosparkapi/issues +.. _projects: https://github.com/CiscoDevNet/ciscosparkapi/projects +.. _pull requests: https://github.com/CiscoDevNet/ciscosparkapi/pulls +.. _releases: https://github.com/CiscoDevNet/ciscosparkapi/releases +.. _charter: https://github.com/CiscoDevNet/spark-python-packages-team/blob/master/Charter.md +.. _the repository: ciscosparkapi_ +.. _pull request: `pull requests`_ From 8f995911bc99588f48bba358c000a9f130c9bcad Mon Sep 17 00:00:00 2001 From: Chris Lunsford Date: Mon, 17 Oct 2016 15:10:45 -0400 Subject: [PATCH 2/7] Update README.rst #15 Correct formatting issues and simplify text. --- README.rst | 78 +++++++++++++++++++++++++++++++----------------------- 1 file changed, 45 insertions(+), 33 deletions(-) diff --git a/README.rst b/README.rst index eee7c55..81cf7de 100644 --- a/README.rst +++ b/README.rst @@ -6,9 +6,10 @@ ciscosparkapi Simple, lightweight and scalable Python API wrapper for the Cisco Spark APIs ---------------------------------------------------------------------------- -Sure, working with the Cisco Spark APIs is easy (see `devloper.ciscospark.com`_). They are RESTful, simply and naturally structured, require only a simple Access Token to authenticate, and the data is elegantly represented in intuitive JSON. What could be easier? +Sure, working with the Cisco Spark APIs is easy (see `devloper.ciscospark.com`_). They are *RESTful*, *naturally structured*, require only a *simple Access Token for authentication*, and *the data is elegantly represented in intuitive JSON*. What could be easier? .. code-block:: python + import requests URL = 'https://api.ciscospark.com/v1/messages' @@ -31,19 +32,20 @@ Sure, working with the Cisco Spark APIs is easy (see `devloper.ciscospark.com`_) # Oops something went wrong... Better do something about it. print(response.status_code, response.text) -Like I said, easy. However, it is rather repetitive... +Like I said, *EASY*. However, in use, the code can be rather repetitive... + +- You have to setup the environment every time +- You have to remember URLs and request arguments (or reference the docs) +- You have to parse the returned JSON and setup variables to hold the attributes you need +- When requesting lists of items, you have to deal with pagination_ -- You have to setup this same environment every time you want do something with just one of the API calls. -- ...and this was just posting a text message to Spark, what about when you want to do something like uploading a file? -- What if you your application or use case needs to make calls to several of the Spark APIs? -- This can turn into a lot of boiler plate code! 
-- Sure, you can consolidate the repetitious code into functions, and end up building your own library of functions (and many have done this!), or you can build upon the shoulders of those who have gone before you... -Enter **ciscosparkapi**, a simple API wrapper that wraps the RESTful Spark API calls and returned JSON objects within native Python objects and function calls. +Enter **ciscosparkapi**, a simple API wrapper that wraps all of the Spark API calls and returned JSON objects within native Python objects and function calls. -The above Python code can be consolidated to the following: +With ciscosparkapi, the above Python code can be consolidated to the following: .. code-block:: python + from ciscosparkapi import CiscoSparkAPI api = CiscoSparkAPI() # Creates a new API 'connection object' @@ -56,20 +58,22 @@ The above Python code can be consolidated to the following: The ciscosparkapi package handles all of this for you: -+ Your Spark access token can automatically be retrieved from a ``SPARK_ACCESS_TOKEN`` environment variable. *You don't have to provide it every time you create a new API connection object.* -+ *You don't have to remember the API endpoint URLs or JSON parameters.* They have been wrapped in native Python methods. -+ If your Python IDE supports **auto-completion** (like PyCharm_), *you can simply navigate the available methods and object attributes right within your IDE*. -+ The JSON objects returned from the Cisco Spark cloud are modeled as native Python objects, which also support auto-completion and native attribute access. *You don't have to think about parsing the JSON objects or working with dictionaries or creating lots of variables to hold the returned object's attributes. You can simply interact with the returned object as a native Python object.* -+ When requesting 'lists of objects' from Spark, like enumerating the messages in a room or a list of rooms of which you are a member, you don't have think about handling and requesting pages_ of responses. These are simply and efficiently abstracted and requested as needed - as you access the returned objects. *Other than a slight delay as additional objects are requested from the Spark cloud, you won't have to deal with or think about pages of responses.* ++ Reads your Spark access token from a ``SPARK_ACCESS_TOKEN`` environment variable ++ Wraps and represents all Spark API calls as a simple hierarchal tree of native-Python methods (with default arguments everywhere possible) ++ If your Python IDE supports **auto-completion** (like PyCharm_), you can navigate the available methods and object attributes right within your IDE ++ Represents all returned JSON objects as native Python objects - you can access all of the object's attributes using native *dot-syntax* ++ **Automatic and Transparent Pagination!** When requesting 'lists of objects' from Spark, requests for additional pages of responses are efficiently and automatically requested as needed ++ Multipart encoding and uploading of local files, when creating messages with local file attachements -...which lets you do powerful things simply: +All of this, combined, lets you do powerful things simply: .. 
code-block:: python + from ciscosparkapi import CiscoSparkAPI api = CiscoSparkAPI() - # Find all of rooms that have 'ciscosparkapi Demo' in their title + # Find all rooms that have 'ciscosparkapi Demo' in their title all_rooms = api.rooms.list() demo_rooms = [room for room in all_rooms if 'ciscosparkapi Demo' in room.title] @@ -82,28 +86,32 @@ The ciscosparkapi package handles all of this for you: # Add people to the new demo room email_addresses = ["test01@cmlccie.com", "test02@cmlccie.com"] - for email_address in email_addresses: - api.memberships.create(demo_room.id, personEmail=email_address) + for email in email_addresses: + api.memberships.create(demo_room.id, personEmail=email) # Post a message to the new room, and upload a file api.messages.create(demo_room.id, text="Welcome to the room!", files=["welcome.jpg"]) -That's at least six Spark API calls, and likely more than that depending on how rooms are returned by Spark (remember paging is handled for you automatically) and how many people you add to the room. All in only about 23 lines (which includes comments). +That's more than six Spark API calls in less than 23 lines of script code (with comments)! +...and likely more than that depending on how many rooms are returned by Spark (remember pagination is handled for you automatically). Installation ------------ -ciscosparkapi is available on PyPI. Install it via PIP, or alternatively you can download the package from GitHub and install it via setuptools. +ciscosparkapi is available on PyPI. Installation and updating of ciscosparkapi is easy: + +**Install via PIP** -**PIP Installation** .. code-block:: bash + $ pip install ciscosparkapi -**git / setuptools Installation** +**Upgrading to the latest Version** + .. code-block:: bash - $ git clone https://github.com/CiscoDevNet/ciscosparkapi.git - $ python ciscosparkapi/setup.py install + + $ pip install ciscosparkapi --upgrade Releases & Release Notes @@ -111,7 +119,7 @@ Releases & Release Notes Complete and usable *Beta* releases have been published for this package. -While the package APIs may change while in beta, the package capabilities should all be functional. If you expereince any issues using this package, please report them using the issues_ log on the packages GitHub page. +While the package APIs may change while in beta, the package capabilities should all be functional. If you expereince any issues using this package, please report them using the issues_ log. Please see the releases_ page for release notes on the incremental functionality and bug fixes that have been incorporated into the published releases. @@ -122,19 +130,23 @@ Contribution ciscosparkapi_ and it's sister project ciscosparksdk_ are community development projects. Feedback, thoughts, ideas and code contributions are most welcome! To contribute to ciscosparkapi_ please use the following resources: + * Feedback, issues, thoughts and ideas... Please use the issues_ log. * Interested in contributing code? - # Check for open issues_ or create a new one. - * Assign yourself to the issue you want to work on, and communicate with any others that may be working the issue. - * Project workflow is being managed via the GitHub projects_ feature. Move your issue to the 'In Progress' column of the project being worked. - # Review the project charter_ for coding standards and practices. - # Fork a copy of `the repository`_. - # Add your code to your forked repository. - # Submit a `pull request`_, and move your issue to the 'Code Review' column on the projects_ page.
+ + #. Check for open issues_ or create a new one. + + * Assign yourself to the issue you want to work on, and communicate with any others that may be working the issue. + * Project workflow is being managed via the GitHub projects_ feature. Move your issue to the 'In Progress' column of the project being worked. + + #. Review the project charter_ for coding standards and practices. + #. Fork a copy of `the repository`_. + #. Add your code to your forked repository. + #. Submit a `pull request`_, and move your issue to the 'Code Review' column on the projects_ page. .. _devloper.ciscospark.com: https://developer.ciscospark.com -.. _pages: https://developer.ciscospark.com/pagination.html +.. _pagination: https://developer.ciscospark.com/pagination.html .. _PyCharm: https://www.jetbrains.com/pycharm/ .. _ciscosparkapi: https://github.com/CiscoDevNet/ciscosparkapi .. _ciscosparksdk: https://github.com/CiscoDevNet/ciscosparksdk From 049e48a4c9e6a3f5c16718dd1ed0b5d762eb0251 Mon Sep 17 00:00:00 2001 From: Chris Lunsford Date: Mon, 17 Oct 2016 15:39:20 -0400 Subject: [PATCH 3/7] Update README.rst #15 Add examples and documentation sections and misc edits. --- README.rst | 69 +++++++++++++++++++++++++++++++++++++++++++----------- 1 file changed, 55 insertions(+), 14 deletions(-) diff --git a/README.rst b/README.rst index 81cf7de..421048d 100644 --- a/README.rst +++ b/README.rst @@ -6,7 +6,10 @@ ciscosparkapi Simple, lightweight and scalable Python API wrapper for the Cisco Spark APIs ---------------------------------------------------------------------------- -Sure, working with the Cisco Spark APIs is easy (see `devloper.ciscospark.com`_). They are *RESTful*, *naturally structured*, require only a *simple Access Token for authentication*, and *the data is elegantly represented in intuitive JSON*. What could be easier? +.. image:: https://img.shields.io/pypi/v/ciscosparkapi.svg + :target: https://pypi.python.org/pypi/ciscosparkapi + +Sure, working with the Cisco Spark APIs is easy (see `developer.ciscospark.com`_). They are *RESTful*, *naturally structured*, require only a *simple Access Token for authentication*, and *the data is elegantly represented in intuitive JSON*. What could be easier? .. code-block:: python @@ -59,11 +62,11 @@ With ciscosparkapi, the above Python code can be consolidated to the following: The ciscosparkapi package handles all of this for you: + Reads your Spark access token from a ``SPARK_ACCESS_TOKEN`` environment variable -+ Wraps and represents all Spark API calls as a simple hierarchal tree of native-Python methods (with default arguments everywhere possible) ++ Wraps and represents all Spark API calls as a simple hierarchical tree of native-Python methods (with default arguments provided everywhere possible!) 
+ If your Python IDE supports **auto-completion** (like PyCharm_), you can navigate the available methods and object attributes right within your IDE -+ Represents all returned JSON objects as native Python objects - you can access all of the object's attributes using native *dot-syntax* ++ Represents all returned JSON objects as native Python objects - you can access all of the object's attributes using native *dot.syntax* + **Automatic and Transparent Pagination!** When requesting 'lists of objects' from Spark, requests for additional pages of responses are efficiently and automatically requested as needed -+ Multipart encoding and uploading of local files, when creating messages with local file attachements ++ Multipart encoding and uploading of local files, when creating messages with local file attachments All of this, combined, lets you do powerful things simply: @@ -99,7 +102,7 @@ That's more than six Spark API calls in less than 23 lines of script code (with Installation ------------ -ciscosparkapi is available on PyPI. Installation and updating of ciscosparkapi is easy: +Installation and updating of ciscosparkapi is easy: **Install via PIP** @@ -117,37 +120,75 @@ ciscosparkapi is available on PyPI. Installation and updating of ciscosparkapi Releases & Release Notes ------------------------ -Complete and usable *Beta* releases have been published for this package. +Complete and usable *Beta* releases_ have been published for this package. -While the package APIs may change while in beta, the package capabilities should all be functional. If you expereince any issues using this package, please report them using the issues_ log. +While the package APIs may change (while the package is in beta), the package capabilities should all be functional. If you experience any issues using this package, please report them using the issues_ log. Please see the releases_ page for release notes on the incremental functionality and bug fixes that have been incorporated into the published releases. +Examples +-------- + +Looking for some examples or sample scripts? Check out the examples_ folder! + +Have a good example script you would like to share? Please feel free to contribute! + + +Documentation +------------- + +All of the user-facing methods have complete docstrings. You can view the docstrings for any method either from the `source files`_, or by using the Python ``help()`` function. + +.. code-block:: python + + >> from ciscosparkapi import CiscoSparkAPI + >> api = CiscoSparkAPI() + >> help(api.messages.create) + create(self, roomId=None, toPersonId=None, toPersonEmail=None, text=None, markdown=None, files=None) method of ciscosparkapi.api.messages.MessagesAPI instance + Posts a message to a room. + + Posts a message, and optionally, a media content attachment, to a room. + + You must specify either a roomId, toPersonId or toPersonEmail when + posting a message, and you must supply some message content (text, + markdown, files). + + Args: + roomId(string_types): The room ID. + toPersonId(string_types): The ID of the recipient when sending a + private 1:1 message. + ... + +Full standalone online documentation is coming soon (it's on the backlog!). + + Contribution ------------ ciscosparkapi_ and it's sister project ciscosparksdk_ are community development projects. Feedback, thoughts, ideas and code contributions are most welcome! 
-To contribute to ciscosparkapi_ please use the following resources: +To contribute to ciscosparkapi please use the following resources: * Feedback, issues, thoughts and ideas... Please use the issues_ log. * Interested in contributing code? - #. Check for open issues_ or create a new one. + 1. Check for open issues_ or create a new one. * Assign yourself to the issue you want to work on, and communicate with any others that may be working the issue. * Project workflow is being managed via the GitHub projects_ feature. Move your issue to the 'In Progress' column of the project being worked. - #. Review the project charter_ for coding standards and practices. - #. Fork a copy of `the repository`_. - #. Add your code to your forked repository. - #. Submit a `pull request`_, and move your issue to the 'Code Review' column on the projects_ page. + 2. Review the project charter_ for coding standards and practices. + 3. Fork a copy of `the repository`_. + 4. Add your code to your forked repository. + 5. Submit a `pull request`_, and move your issue to the 'Code Review' column on the projects_ page. -.. _devloper.ciscospark.com: https://developer.ciscospark.com +.. _developer.ciscospark.com: https://developer.ciscospark.com .. _pagination: https://developer.ciscospark.com/pagination.html .. _PyCharm: https://www.jetbrains.com/pycharm/ +.. _examples: https://github.com/CiscoDevNet/ciscosparkapi/tree/master/examples +.. _source files: https://github.com/CiscoDevNet/ciscosparkapi/tree/master/ciscosparkapi .. _ciscosparkapi: https://github.com/CiscoDevNet/ciscosparkapi .. _ciscosparksdk: https://github.com/CiscoDevNet/ciscosparksdk .. _issues: https://github.com/CiscoDevNet/ciscosparkapi/issues From 6bce86c1db244ecd35e46a50933ba1ee5cf031da Mon Sep 17 00:00:00 2001 From: Chris Lunsford Date: Mon, 17 Oct 2016 15:42:11 -0400 Subject: [PATCH 4/7] Switch package to use new README.rst #15 Delete old README.md and point setup.py to new README.rst --- README.md | 73 ------------------------------------------------------- setup.py | 2 +- 2 files changed, 1 insertion(+), 74 deletions(-) delete mode 100644 README.md diff --git a/README.md b/README.md deleted file mode 100644 index 0d1ccb1..0000000 --- a/README.md +++ /dev/null @@ -1,73 +0,0 @@ -# ciscosparkapi -Simple, lightweight and scalable Python API wrapper for the Cisco Spark APIs - -## Overview -Provides single Pythonic wrapper class that represents the Cisco Spark API interfaces and returned JSON objects as native Python objects. - - * Supports Python versions 2 and 3. - - * Leverages generator containers and RFC5988 web linking to provide simple and efficient 'paging' of Cisco Spark data objects returned by the Cisco Spark cloud. - - * All Cisco Spark JSON objects and attributes are represented as native python objects. - * As new Cisco Spark attributes are added and returned by the Spark cloud service, they will be automatically available in the respective Python objects - no library update required. - * New object types can be quickly created and modeled by via the generic SparkData class, or you can easily subclass SparkData to provide additional functionality. - - * The CiscoSparkAPI class facilitates the creation of simple 'connection objects.' All API calls are wrapped by this single class, and are available via a simple hierarchical structure - like CiscoSparkAPI.rooms.list(). - * Argument defaults are provided to make getting connected simple, and can be easily overridden if needed. 
- * The only setting required to get connected is your Cisco Spark Access Token (see [developer.ciscospark.com](https://developer.ciscospark.com/getting-started.html)). When creating a new CiscoSparkAPI object, you may provide your access token one of two ways: - 1. By setting a SPARK_ACCESS_TOKEN environment variable. - 2. Via the ```CiscoSparkAPI(access_token="")``` argument. - * All API calls are provided as simple method calls on the API connection objects. - - -## Installation -ciscosparkapi is available on PyPI. Install it via PIP, or alternatively you can download the package from GitHub and install it via setuptools. - -**PIP Installation** -``` -$ pip install ciscosparkapi -``` - -**git / setuptools Installation** -``` -$ git clone https://github.com/CiscoDevNet/ciscosparkapi.git -$ python setup.py install -``` - -## Examples - -```python -from ciscosparkapi import CiscoSparkAPI - - -# By default retrieves your access token from the SPARK_ACCESS_TOKEN environement variable -api = CiscoSparkAPI() - - -rooms = api.rooms.list() # Returns an generator container providing support for RFC5988 paging -for room in rooms: # Efficiently iterates through returned objects - print room.title # JSON objects are represented as native Python objects - - -# Creating a list from the returned generator containers is easy -teams = api.teams.list() -team_list = list(teams) -print teams_list -``` - - -## Current Status -**Beta(s) Released!** - -Please check the [releases page](https://github.com/CiscoDevNet/ciscosparkapi/releases) for details on the latest releases. - -We have released the first beta distributions for this package! Please test out the package for your use cases, and raise [issues](https://github.com/CiscoDevNet/ciscosparkapi/issues) for any problems you encounter. Also, **PLEASE** create new [issues](https://github.com/CiscoDevNet/ciscosparkapi/issues) to provide any feedback on the package API structure (names, method calls and etc.). The package APIs are still subject to change, and we would like to get these nailed down before we release v1 for the package. - - -## Community Development Project Information -This is a collaborative community development project working to create two packages to be published to the Python Package Index: - - 1. [**ciscosparkapi**](https://github.com/CiscoDevNet/ciscosparkapi) - Simple, lightweight and scalable Python API wrapper for the Cisco Spark APIs - 2. [**ciscosparksdk**](https://github.com/CiscoDevNet/ciscosparksdk) - Additional features and functionality useful to developers building on Cisco Spark API - -Contributions and feedback are welcome. Information on contributing this project can be found [here in the project Charter](https://github.com/CiscoDevNet/spark-python-packages-team/blob/master/Charter.md). 
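The README.md content removed above describes the package's 'paging' support in terms of generator containers and RFC 5988 web linking. As a rough illustration of the pattern the package wraps (this is not the ciscosparkapi implementation, just a sketch built directly on ``requests``; the helper name and the ``items`` response key are assumptions made for the example), automatic pagination against the Spark cloud looks roughly like this:

.. code-block:: python

    import os

    import requests


    def spark_list_items(url, **params):
        """Yield every item from a paginated Spark 'list' endpoint.

        Illustrative only -- ciscosparkapi wraps this pattern for you.
        """
        headers = {'Authorization': 'Bearer ' + os.environ['SPARK_ACCESS_TOKEN']}
        while url:
            response = requests.get(url, params=params, headers=headers)
            response.raise_for_status()
            # Spark 'list' responses wrap the returned objects in an 'items' array.
            for item in response.json()['items']:
                yield item
            # requests exposes the parsed RFC 5988 Link header as response.links;
            # the 'next' link, when present, points at the next page of results.
            url = response.links.get('next', {}).get('url')
            params = {}  # the 'next' URL already carries the query parameters


    # Pages are only fetched as the iteration reaches them.
    for room in spark_list_items('https://api.ciscospark.com/v1/rooms'):
        print(room['title'])
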
diff --git a/setup.py b/setup.py index 3f88847..4acb953 100644 --- a/setup.py +++ b/setup.py @@ -13,7 +13,7 @@ # Get the long description from the README file -with open(path.join(here, 'README.md'), encoding='utf-8') as f: +with open(path.join(here, 'README.rst'), encoding='utf-8') as f: long_description = f.read() From c415ecc84e0d54602397c0c711edd66d0b1fe960 Mon Sep 17 00:00:00 2001 From: Chris Lunsford Date: Mon, 17 Oct 2016 15:44:14 -0400 Subject: [PATCH 5/7] README.rst Update Sub-Title --- README.rst | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/README.rst b/README.rst index 421048d..d0a26f3 100644 --- a/README.rst +++ b/README.rst @@ -2,9 +2,9 @@ ciscosparkapi ============= ----------------------------------------------------------------------------- -Simple, lightweight and scalable Python API wrapper for the Cisco Spark APIs ----------------------------------------------------------------------------- +------------------------------------------------------------------------- +Simple, lightweight, scalable Python API wrapper for the Cisco Spark APIs +------------------------------------------------------------------------- .. image:: https://img.shields.io/pypi/v/ciscosparkapi.svg :target: https://pypi.python.org/pypi/ciscosparkapi From a19a975adedd81cb5233453a2bfa4eb4b9239635 Mon Sep 17 00:00:00 2001 From: Chris Lunsford Date: Mon, 17 Oct 2016 15:48:37 -0400 Subject: [PATCH 6/7] Update README.rst --- README.rst | 25 +++++++++++++------------ 1 file changed, 13 insertions(+), 12 deletions(-) diff --git a/README.rst b/README.rst index d0a26f3..074b033 100644 --- a/README.rst +++ b/README.rst @@ -142,9 +142,9 @@ All of the user-facing methods have complete docstrings. You can view the docst .. code-block:: python - >> from ciscosparkapi import CiscoSparkAPI - >> api = CiscoSparkAPI() - >> help(api.messages.create) + >>> from ciscosparkapi import CiscoSparkAPI + >>> api = CiscoSparkAPI() + >>> help(api.messages.create) create(self, roomId=None, toPersonId=None, toPersonEmail=None, text=None, markdown=None, files=None) method of ciscosparkapi.api.messages.MessagesAPI instance Posts a message to a room. @@ -170,18 +170,19 @@ ciscosparkapi_ and it's sister project ciscosparksdk_ are community development To contribute to ciscosparkapi please use the following resources: -* Feedback, issues, thoughts and ideas... Please use the issues_ log. -* Interested in contributing code? +Feedback, issues, thoughts and ideas... Please use the issues_ log. - 1. Check for open issues_ or create a new one. +Interested in contributing code? - * Assign yourself to the issue you want to work on, and communicate with any others that may be working the issue. - * Project workflow is being managed via the GitHub projects_ feature. Move your issue to the 'In Progress' column of the project being worked. +#. Check for open issues_ or create a new one. - 2. Review the project charter_ for coding standards and practices. - 3. Fork a copy of `the repository`_. - 4. Add your code to your forked repository. - 5. Submit a `pull request`_, and move your issue to the 'Code Review' column on the projects_ page. + * Assign yourself to the issue you want to work on, and communicate with any others that may be working the issue. + * Project workflow is being managed via the GitHub projects_ feature. Move your issue to the 'In Progress' column of the project being worked. + +#. Review the project charter_ for coding standards and practices. +#. Fork a copy of `the repository`_. +#. 
Add your code to your forked repository. +#. Submit a `pull request`_, and move your issue to the 'Code Review' column on the projects_ page. .. _developer.ciscospark.com: https://developer.ciscospark.com From eb3342947a612fc81771f39992264630b9236f63 Mon Sep 17 00:00:00 2001 From: Chris Lunsford Date: Mon, 17 Oct 2016 15:50:05 -0400 Subject: [PATCH 7/7] Update README.rst --- README.rst | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/README.rst b/README.rst index 074b033..9d8bd79 100644 --- a/README.rst +++ b/README.rst @@ -51,12 +51,12 @@ With ciscosparkapi, the above Python code can be consolidated to the following: from ciscosparkapi import CiscoSparkAPI - api = CiscoSparkAPI() # Creates a new API 'connection object' + api = CiscoSparkAPI() try: - message = api.messages.create('', text='') # Creates a new message and raises an exception if something goes wrong. + message = api.messages.create('', text='') print("New message created, with ID:", message.id) print(message.text) - except SparkApiError as e: # Handles the exception, if something goes wrong + except SparkApiError as e: print(e) The ciscosparkapi package handles all of this for you:
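
One thing to note when reading the consolidated example in its final form (after PATCH 7/7): it catches ``SparkApiError`` but only imports ``CiscoSparkAPI``. A minimal, self-contained sketch of that example is below; it assumes the package exposes ``SparkApiError`` at the top level, as the bare ``except SparkApiError`` clause implies, and it leaves the room ID and message text placeholders empty exactly as the README does.

.. code-block:: python

    from ciscosparkapi import CiscoSparkAPI, SparkApiError  # SparkApiError import assumed

    api = CiscoSparkAPI()  # reads your token from the SPARK_ACCESS_TOKEN environment variable
    try:
        message = api.messages.create('', text='')  # supply a real room ID and message text
        print("New message created, with ID:", message.id)
        print(message.text)
    except SparkApiError as e:
        print(e)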