
Conversation


@matteobettini matteobettini commented Jul 18, 2023

This PR is a superset of #1365, which should be closed.
It tackles tasks from #969.
Together with #1391 it becomes a superset of #1204, which should eventually be closed.

It fixes #1343.

The methods still missing are due to time constraints; we can implement them in the future on an as-needed basis.

Fixed operations in lazy stacked composite specs:

  • unsqueeze
  • squeeze
  • print (behaves like LazyStackedTensorDict)
  • __len__
  • __eq__
  • __ne__
  • project
  • type_check
  • __delitem__
  • __iter__
  • keys
  • __setitem__
  • update
  • is_in
  • unbind

Missing:

  • encode
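As an illustration of the semantics the list above refers to, here is a hypothetical, simplified Python sketch (not the torchrl implementation, class and method names chosen for illustration only) of how `keys`, `__len__`, and `unbind` behave on a lazy stack of composite specs with heterogeneous keys:

```python
# Hypothetical sketch of a lazy stack of dict-like composite specs whose
# entries may have different keys. Per-key "specs" are modelled here as
# plain shape tuples; this is NOT the torchrl API, only the semantics.

class LazyStackedComposite:
    def __init__(self, *specs):
        # Each element is a dict mapping key -> per-key spec (here: a shape tuple).
        self.specs = list(specs)

    def __len__(self):
        # Length is the size of the stack dimension.
        return len(self.specs)

    def keys(self):
        # Only keys shared by every stacked element are exposed at the stack
        # level; heterogeneous ("lazy") keys stay local to each element.
        shared = set(self.specs[0])
        for spec in self.specs[1:]:
            shared &= set(spec)
        return sorted(shared)

    def unbind(self):
        # Unbinding recovers the original, possibly heterogeneous, elements.
        return tuple(self.specs)


stacked = LazyStackedComposite(
    {"camera": (32, 32, 3), "lidar": (20,)},
    {"camera": (32, 32, 3), "obs_1": (1, 2)},
)
print(len(stacked))    # 2
print(stacked.keys())  # ['camera']
```

The point of the sketch is the `keys` rule: shared keys surface on the stack, while element-only keys (like `lidar` and `obs_1`) are reachable only through indexing or `unbind`.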

Bonus: fixed some operations in lazy stacked specs:

  • shape
  • stack (only allowed when spec type, dtype, and ndim match)
  • print
  • __len__
  • __eq__
  • squeeze
  • unsqueeze
  • unbind

Missing:

  • is_in
  • project
  • type_check
  • encode
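The "stack only if same spec type, dtype and ndims" rule above can be sketched as a compatibility check (hypothetical helper, not the torchrl API; specs are modelled as `(type, dtype, shape)` tuples for illustration):

```python
# Hedged sketch of the stacking guard: two leaf specs may be lazily stacked
# only if their type, dtype, and number of dimensions agree. Individual
# dimension sizes may differ (rendered as -1 in the stacked shape).

def can_stack(spec_a, spec_b):
    """Return True if two (type, dtype, shape) spec descriptions may be stacked."""
    type_a, dtype_a, shape_a = spec_a
    type_b, dtype_b, shape_b = spec_b
    return (
        type_a is type_b
        and dtype_a == dtype_b
        and len(shape_a) == len(shape_b)  # same ndim; sizes may still differ
    )
```

For example, `(3, 2)` and `(5, 2)` float32 specs of the same type are stackable (with a `-1` in the mismatching dimension), while specs with different ndim or dtype are not.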

vmoens and others added 10 commits July 6, 2023 14:56
Signed-off-by: Matteo Bettini <[email protected]>
@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Jul 18, 2023
@matteobettini matteobettini changed the base branch from fix_compositespec to main July 18, 2023 09:57

matteobettini commented Jul 18, 2023

@vmoens for printing heterogeneous dicts, what about something like:

LazyStackedCompositeSpec(
    camera: BoundedTensorSpec(
        shape=torch.Size([3, 32, 32, 3]),
        space=ContinuousBox(
            minimum=Tensor(shape=torch.Size([3, 32, 32, 3]), device=cpu, dtype=torch.float32, contiguous=True), 
            maximum=Tensor(shape=torch.Size([3, 32, 32, 3]), device=cpu, dtype=torch.float32, contiguous=True)),
        device=cpu,
        dtype=torch.float32,
        domain=continuous),
    vector: LazyStackedTensorSpec(
        shape=torch.Size([3, -1]), space=None, device=cpu, dtype=torch.float32, domain=continuous),
    lazy_keys={
        0 -> lidar: UnboundedContinuousTensorSpec(
                  shape=torch.Size([20]),
                  device=cpu,
                  dtype=torch.float32,
                  domain=continuous),
              obs_0: UnboundedContinuousTensorSpec(
                  shape=torch.Size([1]),
                  device=cpu,
                  dtype=torch.float32,
                  domain=continuous),
        1 -> lidar: UnboundedContinuousTensorSpec(
                  shape=torch.Size([20]),
                  device=cpu,
                  dtype=torch.float32,
                  domain=continuous),
              obs_1: UnboundedContinuousTensorSpec(
                  shape=torch.Size([1, 2]),
                  device=cpu,
                  dtype=torch.float32,
                  domain=continuous),
        2 -> obs_2: UnboundedContinuousTensorSpec(
                  shape=torch.Size([1, 2, 3]),
                  device=cpu,
                  dtype=torch.float32,
                  domain=continuous),
    },
    device=cpu, shape=torch.Size([3]))

@matteobettini matteobettini marked this pull request as ready for review July 19, 2023 15:59
@matteobettini matteobettini marked this pull request as draft July 19, 2023 15:59
@matteobettini matteobettini marked this pull request as ready for review July 19, 2023 16:41
@vmoens vmoens added the bug Something isn't working label Jul 27, 2023
@vmoens vmoens changed the title [BugFix] Fix LazyStackedCompositeSpec [BugFix] Fix LazyStackedCompositeSpec Jul 27, 2023
Collaborator

@vmoens vmoens left a comment


Great work!
Aside from the few comments I left, I noticed that the new classes are not part of the doc.

matteobettini and others added 6 commits July 28, 2023 09:34
@matteobettini matteobettini changed the title [BugFix] Fix LazyStackedCompositeSpec [BugFix] Fix LazyStackedCompositeSpec and introducing consolidate_spec Jul 28, 2023
Collaborator

@vmoens vmoens left a comment


LGTM
Do the unticked boxes raise errors when called?

@matteobettini
Contributor Author

> LGTM Do the unticked boxes raise errors when called?

I'll find out and fix it if needed.

Collaborator

@vmoens vmoens left a comment


LGTM, thanks!

@vmoens vmoens merged commit 575a0a4 into pytorch:main Jul 31, 2023
@matteobettini matteobettini deleted the fix_compositespec branch September 4, 2023 08:31
vmoens added a commit to hyerra/rl that referenced this pull request Oct 10, 2023


Development

Successfully merging this pull request may close these issues.

[BUG] Operations on Spec of Stacked CompositeSpec with different keys crashes

3 participants