I have the following LINQ query:
var aKeyword = "ACT";
var results = from a in db.Activities
              where a.Keywords.Split(',').Contains(aKeyword)
              select a;
In response to your performance considerations on a big dataset:
You are going to be doing non-indexed wildcard string matching on the client, so yes, there will be a performance hit.
Is there a reason why you have multiple keywords in one table field? You could normalize that out into an ActivityKeywords table, so that each Activity is linked to a number of Keyword records:
Activities(activity_id, ... /* remove keywords field */) ---> ActivityKeywords(activity_id, keyword_id) ---> Keywords(keyword_id, value)
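With that schema the lookup becomes a plain server-side join against an indexable column. A rough sketch, assuming LINQ-to-SQL entities generated from the tables above (the property names `ActivityId`, `KeywordId`, and `Value` are placeholders for whatever your designer produces):

    var aKeyword = "ACT";
    var results = from a in db.Activities
                  join ak in db.ActivityKeywords on a.ActivityId equals ak.ActivityId
                  join k in db.Keywords on ak.KeywordId equals k.KeywordId
                  where k.Value == aKeyword
                  select a;

Put an index on Keywords.value and this turns into an ordinary indexed equality lookup instead of a string scan.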
Check out Non-first normal form: http://en.wikipedia.org/wiki/Database_normalization
EDIT: Also, even if you were to stick with the one column, there is a way to do everything server-side, provided the data follows a strict 'keyword1,keyword2,...,keywordN' syntax (no spaces after the commas):
var aKeyword = "ACT";
var results = from a in db.Activities
              where ("," + a.Keywords + ",").Contains("," + aKeyword + ",")
              select a;
Wrapping both the column and the search term in commas handles the first and last keyword in the list, and avoids false matches on longer keywords (e.g. "REACT" no longer matches a search for "ACT").
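A quick in-memory check of that predicate (plain LINQ-to-Objects, just to illustrate the string logic on some made-up sample rows):

    using System;
    using System.Linq;

    class Demo
    {
        static void Main()
        {
            // Stand-in values for the Keywords column, chosen to show why
            // delimiter-wrapping beats a raw Contains.
            var rows = new[] { "ACT,PLAY", "REACT,STAGE", "ACT", "FACTORY" };
            var aKeyword = "ACT";

            var matches = rows.Where(keywords =>
                ("," + keywords + ",").Contains("," + aKeyword + ","));

            Console.WriteLine(string.Join(" | ", matches)); // ACT,PLAY | ACT
        }
    }

"REACT,STAGE" and "FACTORY" are correctly rejected, and the single-keyword row "ACT" is correctly kept.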