Book Title: Proceedings of the 20th International FLAIRS Conference
Date: February 14, 2007
Abstract: Web search engines like Google have made us all smarter by providing ready access to the world's knowledge whenever we need to look up a fact, learn about a topic, or evaluate opinions. The W3C's Semantic Web effort aims to make such knowledge more accessible to computer programs by publishing it in machine-understandable form. As the volume of Semantic Web data grows, software agents will need their own search engines to help them find the relevant and trustworthy knowledge they need to perform their tasks. We will discuss the general issues underlying the indexing and retrieval of RDF-based information and describe Swoogle, a crawler-based search engine whose index contains information on over two million RDF documents, and TripleShop, which uses Swoogle to automatically build datasets appropriate for responding to user-supplied queries. We will illustrate their use in ELVIS (Ecosystem Location Visualization and Information System), a distributed platform for constructing end-to-end use cases that demonstrate the Semantic Web's utility for integrating scientific data.
Type: InProceedings
Tags: ecoinformatics, rdf, semantic web, swoogle
Attachments:
345.pdf