On 8 Oct 2019, at 17:07, José Suárez-Varela
<jsuarezv(a)ac.upc.edu> wrote:
Hi Nathan,
In the ACM SOSR paper we train the model only with 260,000 training samples from the NSF
network topology and evaluate it on 100,000 samples simulated on the GBN and GEANT2
topologies. You can find the datasets used in this paper at the following link:
https://github.com/knowledgedefinednetworking/Unveiling-the-potential-of-GN…
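
For illustration, the workflow is essentially "train once on NSF, then score the frozen model
separately on each evaluation topology". Below is a rough sketch of that loop with a toy Keras
model and random stand-in data; it is not our actual RouteNet code or the real datasets.

import numpy as np
import tensorflow as tf

# Toy stand-in for RouteNet: path-level features in, one delay prediction out.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(16,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])

# Random placeholders for the real samples (260,000 NSF training samples,
# 100,000 evaluation samples per topology in the paper).
def toy_samples(n):
    return (np.random.rand(n, 16).astype("float32"),
            np.random.rand(n, 1).astype("float32"))

x_train, y_train = toy_samples(2600)          # scaled-down stand-in for the NSF training set
model.fit(x_train, y_train, epochs=3, verbose=0)

# Evaluate the already-trained model separately on each topology's samples.
eval_sets = {"GBN": toy_samples(1000), "GEANT2": toy_samples(1000)}
for name, (x_eval, y_eval) in eval_sets.items():
    mse, mae = model.evaluate(x_eval, y_eval, verbose=0)
    print(f"{name}: MSE={mse:.4f}  MAE={mae:.4f}")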
Please do not confuse these datasets with the ones that we used in our ACM SIGCOMM demo
paper ("Challenging the generalization capabilities of Graph Neural Networks for
network modeling"), which are available at this link:
https://github.com/knowledgedefinednetworking/NetworkModelingDatasets/tree/…
We have run internal evaluations training the jitter model from scratch, and it works
perfectly. However, in the ACM SOSR paper we wanted to show the possibility of transfer
learning: taking a model previously trained (at an early stage) to learn the delay and
retraining it to model the jitter. This typically saves training time.
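
The warm start itself is just loading the delay model's weights before fitting on the jitter
labels. A minimal sketch along those lines, again with a placeholder model and random data
rather than the actual RouteNet code:

import numpy as np
import tensorflow as tf

def build_model():
    # Same architecture for both targets, so the delay weights can be reused.
    return tf.keras.Sequential([
        tf.keras.Input(shape=(16,)),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(1),
    ])

x = np.random.rand(1000, 16).astype("float32")
y_delay = np.random.rand(1000, 1).astype("float32")
y_jitter = np.random.rand(1000, 1).astype("float32")

# 1) Train the delay model and save its weights.
delay_model = build_model()
delay_model.compile(optimizer="adam", loss="mse")
delay_model.fit(x, y_delay, epochs=5, verbose=0)
delay_model.save_weights("delay.weights.h5")

# 2) Start the jitter model from the delay weights and fine-tune for fewer epochs.
jitter_model = build_model()
jitter_model.load_weights("delay.weights.h5")
jitter_model.compile(optimizer="adam", loss="mse")
jitter_model.fit(x, y_jitter, epochs=2, verbose=0)

The point of the sketch is that both targets share the same architecture, so the learned
weights transfer directly; only the labels used for fine-tuning change.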
Regards,
José
On 6/10/19 at 17:26, Nathan Sowatskey wrote:
> Jose, following up now that I have the ACM version of the paper.
>
> I can see that you are testing with both the GBN and Geant2 networks.
>
> You also appear to say that you train only with the NSF network, and so you do not
train with the synth50bw network. Is that correct?
>
> Also, it looks like you have not trained a jitter model from scratch, as you
explained that the jitter model "was trained from a model previously trained for the
delay". Training a jitter model from scratch is one of the aspects I would like to
explore, so I wanted to understand it better.
>
> Many thanks
>
> Nathan
>
>> On 25 Sep 2019, at 14:17, Nathan Sowatskey <nathan(a)nathan.to> wrote:
>>
>> Great, thanks for this. I am trying to get the ACM version of the paper now.
>>
>> Regards
>>
>> Nathan
>>
>>> On 25 Sep 2019, at 11:55, Jose Suárez-Varela <jsuarezv(a)ac.upc.edu>
wrote:
>>>
>>> Dear Nathan,
>>>
>>> You probably read our work-in-progress version uploaded to arXiv. Please check
the latest version, published in the proceedings of ACM SOSR
(https://dl.acm.org/citation.cfm?id=3314357). There, we also make the evaluation on GBN.
>>>
>>> Sorry for the possible misunderstanding. We uploaded the README page
(https://github.com/knowledgedefinednetworking/Unveiling-the-potential-of-GN…)
to provide the link to ACM SOSR.
>>>
>>> Regarding the GBN topology, unfortunately we didn't prepare a figure.
However, you can find an image of this topology in the following paper (Figure 4):
>>>
>>> J. Pedro, J. Santos, and J. Pires, “Performance evaluation of integrated
OTN/DWDM networks with single-stage multiplexing of optical channel data units,” in
Proceedings of ICTON, 2011, pp. 1–4.
>>>
>>> I hope it will be useful.
>>>
>>>
>>> Regards,
>>>
>>> José
>>>
>>>
>>> On 25/09/2019 at 12:14, Nathan Sowatskey wrote:
>>>> Thank you Jose. I have read the paper (many times :-)). I have seen the
details of the evaluation with the Geant2 network, but there is no mention of the GBN
network in the paper.
>>>>
>>>> I am perfectly comfortable with processing the data (you can see my code
here:
https://github.com/Data-Science-Projects/demo-routenet).
>>>>
>>>> Specifically for the GBN network, I wanted to see what the topology looks
like. I have the NED file, but I can’t use that NED file with OMNeT++ (for reasons discussed
elsewhere).
>>>>
>>>> I can, of course, manually reverse engineer the NED file. But I wanted to
ask if there was already a topology diagram just to save me the effort.
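>>>>
>>>> For what it's worth, one quick-and-dirty approach I had in mind is to pull the link
>>>> endpoints out of the NED connections block with a regex and plot them with networkx. This
>>>> is only a sketch: the file path is a placeholder, and the pattern assumes links are written
>>>> one per line as "a.port++ <--> ... <--> b.port++;", which I haven't validated against the
>>>> GBN file.
>>>>
>>>> import re
>>>> import matplotlib.pyplot as plt
>>>> import networkx as nx
>>>>
>>>> # Placeholder path; point this at wherever the GBN NED file lives.
>>>> ned_text = open("path/to/GBN.ned").read()
>>>>
>>>> # Grab the node name on each side of a "<-->" connection line.
>>>> endpoint = re.compile(r"(\w+(?:\[\d+\])?)\.\w+")
>>>>
>>>> g = nx.Graph()
>>>> for line in ned_text.splitlines():
>>>>     if "<-->" in line:
>>>>         nodes = endpoint.findall(line)
>>>>         if len(nodes) >= 2:
>>>>             g.add_edge(nodes[0], nodes[-1])
>>>>
>>>> print(g.number_of_nodes(), "nodes,", g.number_of_edges(), "links")
>>>> nx.draw(g, with_labels=True)
>>>> plt.show()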
>>>>
>>>> Regards
>>>>
>>>> Nathan
>>>>
>>>>> On 25 Sep 2019, at 11:07, Jose Suárez-Varela
<jsuarezv(a)ac.upc.edu> wrote:
>>>>>
>>>>> Hello Nathan,
>>>>>
>>>>> All these datasets were used in our paper:
>>>>>
>>>>> Krzysztof Rusek, José Suárez-Varela, Albert Mestres, Pere Barlet-Ros, and
Albert Cabellos-Aparicio, "Unveiling the potential of Graph Neural Networks for
network modeling and optimization in SDN," in Proceedings of the ACM Symposium on SDN
Research (SOSR), pp. 140-151, April 2019.
>>>>>
>>>>> In particular, we trained RouteNet only with samples from the NSFNET
dataset to predict the delay and jitter. Then, we evaluated the accuracy of the
already-trained models. This evaluation was made separately on the three datasets (NSFNET,
GBN and GEANT2) to test the generalization capability of the model.
>>>>>
>>>>> Please find more details in Section 4 (Evaluation of the accuracy of
the GNN model) of the paper.
>>>>>
>>>>> Also, you can find information on how to process the datasets at the
following link:
>>>>>
>>>>>
http://knowledgedefinednetworking.org/data/README_gnn.pdf
>>>>>
>>>>>
>>>>> Regards,
>>>>>
>>>>> José
>>>>>
>>>>> On 22/09/2019 at 16:55, Nathan Sowatskey wrote:
>>>>>> Hi
>>>>>>
>>>>>> On this page:
>>>>>>
>>>>>>
https://github.com/knowledgedefinednetworking/Unveiling-the-potential-of-GN…
>>>>>>
>>>>>> I have seen that there is this data set:
>>>>>>
>>>>>>
http://knowledgedefinednetworking.org/data/GBN.zip
>>>>>>
>>>>>> It is described as having been used for evaluation, but I can’t
find anything else that refers to it.
>>>>>>
>>>>>> Can anyone tell me more please?
>>>>>>
>>>>>> Many thanks
>>>>>>
>>>>>> Nathan
>> _______________________________________________
>> Kdn-users mailing list
>> Kdn-users(a)knowledgedefinednetworking.org
>>
https://mail.n3cat.upc.edu/cgi-bin/mailman/listinfo/kdn-users