Getting the "Service Provider" by Title

DOORS is my OSLC server, and it hosts a lot of projects (Service Providers). I want to get the OSLC Service Provider URL for a given project, i.e., getServiceProviderURLbyProjectTitle(String projectName). In your DOORS sample (DoorsOauthSample), the method lookupServiceProviderUrl takes forever to find my project URL because it iterates through all the projects.

My question is: Is there any way to run a query on the OSLC Server to return only a specific Project (Service Provider’s) URL for a project title?

Thanks in advance,


  // Step 1: Check if it is the service provider we are looking for by comparing the name.
  // Note: the resource type URI was blank in the original snippet; oslc:ServiceProvider is assumed here.
  ResIterator listResources = rdfModel.listResourcesWithProperty(RDF.type,
  		rdfModel.createResource(OSLCConstants.OSLC_V2 + "ServiceProvider"));
  Property titleProp = rdfModel.createProperty(OSLCConstants.DC, "title");
  // Check each ServiceProvider's title and match it against the one passed in.
  while (listResources.hasNext()) {
  	Resource resource = listResources.next();
  	Statement titlestatement = resource.getProperty(titleProp);
  	if (titlestatement == null)
  		continue;
  	String mytitle = titlestatement.getLiteral().getString();
  	if ((mytitle != null) && mytitle.equalsIgnoreCase(serviceProviderTitle)) {
  		System.out.println("Project Found");
  		retval = catalogUrl;
  		break;
  	}
  }

Please share some links so we can easily identify the code you are referring to.

This piece of code (Step 1) can certainly be improved. But I wonder if this is really what makes your lookup slow. It is simply looping through a list.

But if you look at the followup steps (Step 2 & Step 3, see link below), you will notice a couple of recursive calls within loops. That sounds very expensive.
return lookupServiceProviderUrl(newURL, serviceProviderTitle, client);

My guess is that this code is trying to be quite general and cover many cases. Maybe you can do a more optimal lookup if you know exactly what DOORS returns.
For example, what does the rdfModel returned from DOORS look like? Does it have enough information for you to do the search you need?

Hello Peter,

OSLC 2 does not support filtering on the SP Catalog level, only on the SP level (through a Query Capability and the oslc.where query URI parameter). This feature is under discussion for OSLC 3 with me and David arguing strongly in favour of it and @jamsden and @ndjc having mixed opinions on this.
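For reference, SP-level filtering through a Query Capability looks roughly like this (the host and query base path are hypothetical; a real server advertises its query base in the Service Provider document, and the `oslc.where` value would need URL-encoding in practice):

```
GET https://doors.example.com/oslc/query?oslc.where=dcterms:title="MyProject"&oslc.select=dcterms:title
```

The point of the discussion above is that nothing equivalent exists one level up, on the catalog itself.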

Nevertheless, I do tend to agree with Jad that it’s not the lack of filtering that slows your code down but all the other loops and calls. I just added some measurement and logging code; please try again and tell us how big the delays are. Please ensure the FINEST logging level is enabled.

You were right: the recursion is a killer.

Note: Ran this on a server class machine.


import java.time.Duration;
import java.time.Instant;
import java.util.concurrent.atomic.AtomicInteger;
// …
private static AtomicInteger atom = new AtomicInteger(0);
// …
Instant start = Instant.now();
//STEP 5: Find the OSLC Service Provider for the project area we want to work with
String serviceProviderUrl = lookupServiceProviderUrl(catalogUrl, "Services for " + projectArea, client);
Instant end = Instant.now();
Duration timeElapsed = Duration.between(start, end);
System.out.println("Time taken: " + timeElapsed.toMillis() + " milliseconds");
System.out.println("Number of times called: " + atom.get());
// …
public static String lookupServiceProviderUrl(final String catalogUrl, final String serviceProviderTitle, final OslcOAuthClient client) throws IOException, OAuthException, URISyntaxException, ResourceNotFoundException


Time taken: 101872 milliseconds
Number of times called: 516

1 Like

Right, 500 HTTP requests can easily take 100 s if run sequentially against a heavy service that runs tons of SQL queries on each request.
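As a sanity check on the figures above (both values copied from the measurement output), the average cost works out to just under 200 ms per call:

```java
public class LatencyCheck {
    public static void main(String[] args) {
        // Figures reported in the measurement post above
        long totalMs = 101_872; // "Time taken: 101872 milliseconds"
        int calls = 516;        // "Number of times called: 516"
        // Average cost per lookupServiceProviderUrl invocation
        System.out.println(totalMs / calls + " ms per call"); // prints "197 ms per call"
    }
}
```

So the per-request cost is unremarkable; it is the call count that dominates.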

Imperfect suggestions:

  • try to figure out whether it’s step 2 or 3 that actually finds your SP, then you can eliminate one of the recursive loops.
  • perform some memoisation/caching of the contents of each catalog URI. I guess SPs change much less frequently than resources in SPs in your system. You could use something in-mem or Redis.
  • run the calls asynchronously (see how I did it with POSTing to the Creation Factory here)
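The async idea in the last bullet can be sketched with just the JDK; `fetchCatalog` here is a hypothetical stand-in for the real HTTP call your client makes:

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.stream.Collectors;

public class AsyncCatalogFetch {
    // Stand-in for an HTTP GET of a catalog/SP document; replace with the real client call.
    static String fetchCatalog(String url) {
        return "<contents of " + url + ">";
    }

    public static void main(String[] args) {
        List<String> urls = List.of("https://example.org/cat/1",
                                    "https://example.org/cat/2",
                                    "https://example.org/cat/3");
        // Kick off all fetches concurrently instead of one-by-one in a loop.
        List<CompletableFuture<String>> futures = urls.stream()
                .map(u -> CompletableFuture.supplyAsync(() -> fetchCatalog(u)))
                .collect(Collectors.toList());
        // Block until all have completed and collect the bodies.
        List<String> bodies = futures.stream()
                .map(CompletableFuture::join)
                .collect(Collectors.toList());
        System.out.println(bodies.size() + " catalogs fetched"); // prints "3 catalogs fetched"
    }
}
```

With per-call latencies around 200 ms, fanning out the sibling-catalog requests this way shrinks each recursion level to roughly the cost of its slowest request.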

Do a GET on your ServiceProviderCatalog (SPC), and see if that response contains all the necessary information your code would need to find the desired ServiceProvider (SP).
Attached is an example SPC from a system I am using. As you can see, the relevant SP information is inlined in the response. So, there is really no reason in this case to recursively request info for each SP.
But of course, your system might not be providing this information. I am pretty sure it does though. Otherwise, we need to have a talk with the DOORS people.

1 Like

That would be the perfect way!

I am in favor of OSLC Query on OSLC discovery resources, and believe there is nothing in the OSLC 2.0 spec that rules this out - it’s just not something that has been done as yet.

Current implementations typically rely on some dominant resource container, such as a Project Area, that also acts as the unit for access control and user management. Service provider catalogs list these containers inline, and each Service Provider has all of its services listed inline. Since the containers are quite coarse-grained, this approach has been sufficient for many servers.

Some more details:
It takes about 500 milliseconds (half a second) to make an HTTP call in my environment, which is probably longer than the typical round trip on most corporate networks (LANs).

At the top-level provider catalogs, I have other nested service provider catalogs. It seems like DOORS uses provider catalogs (like a directory) to group both service providers and other provider catalogs to an arbitrary depth.

What I was hoping is that by calling the top-level service provider catalog once (/rdf:Description/jd:oslcCatalogs[1]/oslc:ServiceProviderCatalog[1]) I would get the entire RDF model back (in local memory) and could then use the SPARQL Query Language on the in-memory model.
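If the whole catalog tree really did come back in one model, the lookup could then be a single SPARQL query along these lines (a sketch; that DOORS uses dcterms:title for the project name is an assumption):

```sparql
PREFIX oslc:    <http://open-services.net/ns/core#>
PREFIX dcterms: <http://purl.org/dc/terms/>

SELECT ?sp WHERE {
  ?sp a oslc:ServiceProvider ;
      dcterms:title "MyProject" .
}
```

The catch, as noted below, is that the nested catalogs are only referenced, not inlined, so the model never contains the whole tree.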

As long as DOORS allows for the arbitrary nesting of catalogs and service providers, this is going to be a problem.


1 Like

The service provider (“title”) I am looking for is nested under other catalogs. So, this is not going to work.

That’s a pity. Have you considered doing a one-off expensive lookup to find the lower level Catalog, and then start any future dynamic lookups from that point onwards?

My current plan is to do the expensive search once and then store/cache the { title, URN }. Then always check the cache first given a Project (Service Provider) title. Thanks everyone.
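That plan might look something like this (a minimal in-memory sketch; `expensiveLookup` is a hypothetical stand-in for the full recursive catalog walk):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class ServiceProviderCache {
    private static final Map<String, String> titleToUrl = new ConcurrentHashMap<>();

    // Stand-in for the slow recursive catalog search; replace with the real lookup.
    static String expensiveLookup(String title) {
        return "https://doors.example.com/oslc/sp/" + title.hashCode();
    }

    // Check the cache first; fall back to the expensive search only on a miss.
    static String getServiceProviderUrl(String title) {
        return titleToUrl.computeIfAbsent(title, ServiceProviderCache::expensiveLookup);
    }

    public static void main(String[] args) {
        String first = getServiceProviderUrl("MyProject");  // slow path, populates the cache
        String second = getServiceProviderUrl("MyProject"); // served from the cache
        System.out.println(first.equals(second)); // prints "true"
    }
}
```

A real version would also need some invalidation strategy for when projects are renamed or deleted, but since SPs change rarely that can be as crude as a periodic full refresh.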

1 Like

Sounds reasonable.

@JankCode how did it go? Did you manage to achieve acceptable performance?