I have an IFS Cloud 22R1 installation using remote deployment on Azure Storage.
Is there any way to use Azure Storage for DocMan? I have tried using a shared path, but it does not seem to be working.
For this, I created a repository address using \\<ServerName>\<path> (the path is mapped on the server) and provided the username and password for the server.
I added the same address in the repository and used port 22 (SCP), as the Azure storage path is mapped to a Linux server.
But while checking in a document, it fails to connect to the repository.
Please suggest. Storing documents in the database is not a very good option.
Regards
Pankaj
In theory, this should be possible if you can ensure that there is network connectivity between the k8s cluster and the shared folder. It's important to make sure k8s (Linkerd) is also configured to accept SMB traffic. The port number you specified will NOT be taken into account; it is only used for FTP communication.
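If you want to rule out a pure networking problem, a rough check like the sketch below (Java, with a hypothetical host name) could be run from a pod inside the cluster to confirm that the SMB port (445) on the file share host is reachable. It only tests TCP reachability, not that DocMan itself can use the share.

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class SmbReachabilityCheck {
    public static void main(String[] args) {
        // Hypothetical host name of the server that exposes the Azure file share.
        String host = "fileserver.example.com";
        int smbPort = 445; // SMB runs over TCP 445

        try (Socket socket = new Socket()) {
            // Fail fast if the cluster network (or the Linkerd policy) blocks the traffic.
            socket.connect(new InetSocketAddress(host, smbPort), 5000);
            System.out.println("TCP connection to " + host + ":" + smbPort + " succeeded");
        } catch (IOException e) {
            System.out.println("Could not reach " + host + ":" + smbPort + " - " + e.getMessage());
        }
    }
}
```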
Hello @Mathias Dahl ,
I managed to configure FTP for the Azure storage using IIS on the WMS server. I will share a detailed document on how I did it.
Now, when I check in a document, I don't get any error, but the document is not stored in the location.
Do you know that the Kubernetes cluster can reach the FTP server? If yes, how do you know that? What does the basic data of your Docman repository look like?
Hello,
I could log in to the FTP server from the MT server.
The log shows that it could connect to the FTP server. Check the logs below:
That throws a null pointer exception, which suggests the stream is null ("in" above). Why would it be null if there is a successful FTP connection? Perhaps something closed it before? Either our own code closed the connection or some other code did...
We could try to add a check in our Java code, to see if the stream is not null before we close it. Then again, we might not be able to recreate this internally since, for us, FTP is working. So we would do that fix "blindly" (fixing something without being able to recreate the problem), which is never good.
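Just to illustrate the kind of check I mean (a minimal sketch, not the actual Docman code; the stream name "in" is taken from the stack trace above):

```java
import java.io.IOException;
import java.io.InputStream;

public final class StreamUtil {

    // Defensive close: only close the stream if it was actually opened,
    // so a null "in" does not turn into a NullPointerException.
    static void closeQuietly(InputStream in) {
        if (in == null) {
            return; // nothing to close; the interesting question is why it is null
        }
        try {
            in.close();
        } catch (IOException e) {
            // Log and continue; a failure on close should not mask the real error.
        }
    }
}
```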
I'll discuss this internally and see if we can come up with some theory.
Thanks @Mathias Dahl ,
Looking forward to it. Seems I am there but not there :(
Regards
Pankaj
Hi @Mathias Dahl
I saw the same error with another customer (22R2, remote deployed) when they try to view a document that is stored via FTP. You cannot see any errors in the client during upload, but you get a server error when you try to view the file. The error below is from the OData logs and is the same error as above.
I didn't receive feedback regarding the FTP logs and the connectivity between the k8s cluster and the FTP server, but since the log above shows FTP code 220 (sent in response to a new client connecting to the FTP server, to indicate that the server is ready for the new client), I am not sure this is due to a connectivity problem.
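Note that 220 only confirms the control connection and the server greeting; login and the actual file retrieval can still fail afterwards. Assuming an Apache Commons Net style FTP client (just an assumption about what Docman uses internally, with a hypothetical host, credentials, and path), the difference looks roughly like this:

```java
import java.io.IOException;
import java.io.InputStream;

import org.apache.commons.net.ftp.FTPClient;
import org.apache.commons.net.ftp.FTPReply;

public class FtpReplyCheck {
    public static void main(String[] args) throws IOException {
        FTPClient ftp = new FTPClient();
        ftp.connect("ftp.example.com"); // hypothetical host

        // The 220 greeting is reflected here, right after connecting.
        int reply = ftp.getReplyCode();
        if (!FTPReply.isPositiveCompletion(reply)) {
            ftp.disconnect();
            throw new IOException("FTP server refused connection, reply=" + reply);
        }

        // These steps can still fail even though the greeting was 220.
        if (!ftp.login("user", "password")) {
            throw new IOException("Login failed, reply=" + ftp.getReplyCode());
        }

        // retrieveFileStream returns null if the transfer cannot be started,
        // which would match the null "in" seen in the stack trace above.
        InputStream in = ftp.retrieveFileStream("/docs/example.txt");
        System.out.println("retrieveFileStream returned " + (in == null ? "null" : "a stream")
                + ", reply=" + ftp.getReplyCode());

        if (in != null) {
            in.close();
            ftp.completePendingCommand();
        }
        ftp.logout();
        ftp.disconnect();
    }
}
```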
Check the FTP server logs to see what the communication looks like. On the server I use, I can see every command, actually the whole interaction.