Job with date range executing but not creating an output
Completed

I submitted a job with a date range option and expected to see the specified date range added to the state, but it didn't appear. It turned out the problem was that I had selected the process mode "Add New and Replace Modified" instead of "Replace Date Range". It makes sense that "Add New and Replace Modified" would not work for a code process that connects to an API connection, since the system cannot know what is new and what is modified. Two questions:
1. Shouldn't "Add New and Replace Modified" only be offered as an option when it is possible for the system to know whether there is any new or modified data?
2. At a minimum, shouldn't the default be changed to "Replace Date Range" to reduce the chances of users being affected by this usability issue?
-
When the Input is a connection, some of the process modes are not applicable.
For example, for the Matomo Loader, when the Input is a Matomo API connection, NEW_ONLY and ADD_NEW_REPLACE_MODIFIED are not applicable.
We have a story in 6.3.0 to correct this, SMA-5605.
In future releases, all Syntasa processes will have similar rules restricting them to only the applicable process modes.
However, when the Input is a connection and the process is a Spark code process, we have to assume that anything is possible inside the code, so we cannot apply similar restrictions. For example, a user could write code that determines which dates are new and which are modified by making additional API requests to the input API connection (if the input API supports such a mechanism).
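As an illustration of that last point, here is a minimal sketch of how a code process might classify dates itself. Everything here is hypothetical: `fetch_checksum` stands in for an extra request to the input API (real Matomo endpoints may differ), and `loaded_state` stands in for whatever per-date markers the process has persisted from previous runs.

```python
from datetime import date, timedelta

# Hypothetical persisted state from earlier runs: date -> content marker
# (e.g. a checksum or last-modified value recorded when the date was loaded).
loaded_state = {
    date(2024, 1, 1): "abc",
    date(2024, 1, 2): "def",
}

def fetch_checksum(d):
    """Stand-in for an additional API request that returns a per-date
    marker from the source system. Hard-coded here so the sketch runs
    without a network connection."""
    fake_remote = {
        date(2024, 1, 1): "abc",  # unchanged upstream
        date(2024, 1, 2): "xyz",  # modified upstream
        date(2024, 1, 3): "ghi",  # new date upstream
    }
    return fake_remote.get(d)

def classify_dates(start, end):
    """Partition [start, end] into dates that are new vs modified,
    by comparing the remote marker against the persisted state."""
    new, modified = [], []
    d = start
    while d <= end:
        remote = fetch_checksum(d)
        if remote is not None:
            if d not in loaded_state:
                new.append(d)
            elif loaded_state[d] != remote:
                modified.append(d)
        d += timedelta(days=1)
    return new, modified

new_dates, modified_dates = classify_dates(date(2024, 1, 1), date(2024, 1, 3))
```

With the fake data above, the code process would load 2024-01-03 as new and reload 2024-01-02 as modified, which is exactly the decision the platform cannot make on the user's behalf for an arbitrary code process.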