Commit Graph

2409 Commits

Author SHA1 Message Date
Francisco Reveriano 26e3a28bee Update train.py for distributive programming (#655)
When attempting to run this function in a multi-GPU environment I kept getting a runtime error. I was able to solve the problem by passing this keyword argument to DistributedDataParallel. I first found the solution here:
https://github.com/pytorch/pytorch/issues/22436
and in the PyTorch tutorials

'RuntimeError: Expected to have finished reduction in the prior iteration before starting a new one. This error indicates that your module has parameters that were not used in producing loss. You can enable unused parameter detection by (1) passing the keyword argument find_unused_parameters=True to torch.nn.parallel.DistributedDataParallel; (2) making sure all forward function outputs participate in calculating loss. If you already have done the above two steps, then the distributed data parallel module wasn't able to locate the output tensors in the return value of your module's forward function. Please include the loss function and the structure of the return value of forward of your module when reporting this issue (e.g. list, dict, iterable). '
2019-11-24 22:21:36 -10:00
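The fix from the commit above can be sketched as a minimal single-process example. This is an illustrative assumption, not the repo's actual train.py: the gloo backend on CPU, the `TwoHead` module, and port 29500 are all made up here to reproduce the scenario the error message describes (a forward output that never feeds the loss), and to show where `find_unused_parameters=True` goes.

```python
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

# Single-process process group so DDP can be constructed at all
# (hypothetical address/port, for illustration only).
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group("gloo", rank=0, world_size=1)

class TwoHead(torch.nn.Module):
    # Toy module: the second head's output is returned by forward
    # but never used in the loss, which is exactly the condition the
    # "Expected to have finished reduction" error complains about.
    def __init__(self):
        super().__init__()
        self.used = torch.nn.Linear(4, 2)
        self.unused = torch.nn.Linear(4, 2)

    def forward(self, x):
        return self.used(x), self.unused(x)

# The keyword argument from the commit: let DDP traverse the autograd
# graph each iteration and mark parameters that received no gradient.
model = DDP(TwoHead(), find_unused_parameters=True)

out, _ = model(torch.randn(8, 4))  # second output deliberately dropped
loss = out.sum()
loss.backward()  # succeeds despite the unused head

dist.destroy_process_group()
```

Note the trade-off: `find_unused_parameters=True` adds an extra graph traversal every iteration, so it is worth enabling only when the model genuinely has outputs that do not participate in the loss.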
Glenn Jocher a0ef217842 updates 2019-11-24 20:10:39 -10:00
Glenn Jocher 9b55bbf9e2 updates 2019-11-24 20:08:24 -10:00
Glenn Jocher 7773651e8e updates 2019-11-24 18:38:30 -10:00
Glenn Jocher 2f1c9a3f6f updates 2019-11-24 18:31:06 -10:00
Glenn Jocher f12a2a513a updates 2019-11-24 18:29:29 -10:00
Glenn Jocher 5f00d7419e updates 2019-11-23 19:27:33 -10:00
Glenn Jocher 4aff400777 updates 2019-11-23 19:23:31 -10:00
Glenn Jocher b027c66048 updates 2019-11-23 13:34:37 -10:00
Glenn Jocher 6c6aa483d7 updates 2019-11-23 13:23:38 -10:00
Glenn Jocher 46161ed94d updates 2019-11-23 12:09:46 -10:00
Glenn Jocher 55a6b05228 updates 2019-11-23 09:35:11 -10:00
Glenn Jocher bdf11ffdf1 updates 2019-11-23 09:25:21 -10:00
Glenn Jocher d623a425d9 updates 2019-11-22 16:20:11 -10:00
Glenn Jocher f1e8d23d39 updates 2019-11-22 14:36:49 -10:00
Glenn Jocher 4c61611ce0 updates 2019-11-22 14:20:35 -10:00
Glenn Jocher a137c21dc0 updates 2019-11-22 14:06:16 -10:00
Glenn Jocher 54d907d8c8 updates 2019-11-22 14:03:46 -10:00
Glenn Jocher 46da9fd26c updates 2019-11-22 13:38:28 -10:00
Glenn Jocher bbd6c884e6 updates 2019-11-22 13:27:23 -10:00
Glenn Jocher e701979862 updates 2019-11-22 13:03:29 -10:00
Glenn Jocher 3834b77961 updates 2019-11-21 11:52:48 -08:00
Glenn Jocher 7c59715fda updates 2019-11-21 00:00:17 -08:00
Glenn Jocher f38723c0bd updates 2019-11-20 19:34:22 -08:00
Glenn Jocher a0067ac8fb updates 2019-11-20 19:10:36 -08:00
Glenn Jocher 74b57500c7 updates 2019-11-20 16:02:57 -08:00
Glenn Jocher 3a4ed8b3ab updates 2019-11-20 13:40:24 -08:00
Glenn Jocher bb209111c4 updates 2019-11-20 13:36:15 -08:00
Glenn Jocher 8e327e3bd0 updates 2019-11-20 13:33:25 -08:00
Glenn Jocher 2950f4c816 updates 2019-11-20 13:26:50 -08:00
Glenn Jocher c14ea59c71 updates 2019-11-20 13:24:50 -08:00
Glenn Jocher bd498ae776 updates 2019-11-20 13:14:24 -08:00
Glenn Jocher bac4cc58fd updates 2019-11-20 12:51:05 -08:00
Glenn Jocher e58f0a68b6 updates 2019-11-20 12:05:40 -08:00
Glenn Jocher 429d44282c updates 2019-11-19 20:42:44 -08:00
Glenn Jocher 253e746d30 updates 2019-11-19 19:00:40 -08:00
Glenn Jocher d355e539d9 updates 2019-11-19 18:47:22 -08:00
Glenn Jocher d94b6e88e3 updates 2019-11-19 18:16:35 -08:00
Glenn Jocher d9805d2fb6 updates 2019-11-19 12:42:12 -08:00
Glenn Jocher b758b9c76e updates 2019-11-18 15:01:33 -08:00
Glenn Jocher 2ba1a4c9cc updates 2019-11-18 12:01:17 -08:00
Glenn Jocher 7ebb7d1310 updates 2019-11-18 10:15:17 -08:00
Glenn Jocher 9c716a39c3 updates 2019-11-17 19:00:12 -08:00
Glenn Jocher a1151c04a7 updates 2019-11-17 18:48:50 -08:00
Glenn Jocher b4a71d0588 updates 2019-11-17 17:17:52 -08:00
Glenn Jocher bb936f758a updates 2019-11-17 12:21:59 -08:00
Glenn Jocher eb32fca702 updates 2019-11-16 22:09:31 -08:00
Glenn Jocher 0466285f59 updates 2019-11-16 22:09:15 -08:00
Glenn Jocher dc82956aff updates 2019-11-16 13:12:56 -08:00
Glenn Jocher 84cb744761 updates 2019-11-16 12:34:38 -08:00