rllib use custom registered environments

Submitted by 别来无恙 on 2020-07-21 07:03:25

Question


The RLlib docs provide some information about how to create and train a custom environment. There is some information about registering that environment, but I guess it needs to work differently from gym registration.

I'm testing this out working with the SimpleCorridor environment. If I add the registration code to the file like so:

import gym
from ray.tune.registry import register_env

class SimpleCorridor(gym.Env):
    ...


def env_creator(env_config):
    return SimpleCorridor(env_config)

register_env("corridor", env_creator)

Then I am able to train an algorithm using the string name no problem:

import ray
from ray import tune

if __name__ == "__main__":
    ray.init()
    tune.run(
        "PPO",
        stop={
            "timesteps_total": 10000,
        },
        config={
            "env": "corridor",  # <--- This works fine!
            "env_config": {
                "corridor_length": 5,
            },
        },
    )

However

It is kind of pointless to register the environment in the same file where you define it, because you can just use the class directly. OpenAI gym registration is nice because if you install the environment, then you can use it anywhere just by writing

import gym_corridor

It's not clear to me if there is a way to do the same thing for registering environments for rllib. Is there a way to do this?


Answer 1:


The registry functions in ray are a massive headache; I don't know why they can't recognize environments the way OpenAI Gym does.

Anyway, the way I've solved this is by wrapping my custom environments in another function that imports the environment automatically so I can re-use code. For example:

def env_creator(env_name):
    # Lazily import and return the environment *class* for a given name,
    # so the custom package is only imported when actually needed.
    if env_name == 'CustomEnv-v0':
        from custom_gym.envs.custom_env import CustomEnv0 as env
    elif env_name == 'CustomEnv-v1':
        from custom_gym.envs.custom_env import CustomEnv1 as env
    else:
        raise NotImplementedError(env_name)
    return env

Then, to get it to work with tune.register_env(), wrap your custom env class in a lambda that takes the env config:

env = env_creator('CustomEnv-v0')
tune.register_env('myEnv', lambda config: env(config))

From there, tune.run() should work. It's annoying, but that's the best way I've found to work around this registry issue.
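The if/elif chain above can also be generalized into a small table of lazy imports, so adding an environment is one dictionary entry instead of another branch. A sketch, reusing the (hypothetical) custom_gym module paths from the answer:

```python
import importlib

# Map env-name strings to (module_path, class_name); the module paths
# below are placeholders copied from the answer, not a real package.
ENV_SPECS = {
    "CustomEnv-v0": ("custom_gym.envs.custom_env", "CustomEnv0"),
    "CustomEnv-v1": ("custom_gym.envs.custom_env", "CustomEnv1"),
}

def env_creator(env_name):
    """Look up an env name and import its class only on demand."""
    try:
        module_path, class_name = ENV_SPECS[env_name]
    except KeyError:
        raise NotImplementedError(f"Unknown env: {env_name}")
    module = importlib.import_module(module_path)
    return getattr(module, class_name)
```

It plugs into the same `tune.register_env('myEnv', lambda config: env_creator('CustomEnv-v0')(config))` call; unknown names still raise NotImplementedError as in the original.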



Source: https://stackoverflow.com/questions/58551029/rllib-use-custom-registered-environments
