eight (23%) “very difficult,” and five (14%) were neutral.
In other words, nearly two-thirds of implementers
surveyed were sufficiently challenged by the task
of recruitment to label their efforts “difficult.”
By contrast, about half of the responding plan-
ners were neutral. The remainder were evenly di-
vided between “easy” and “difficult.”
This difference in perceptions between implementers and planners may reflect a simple difference in experience, a shift in implementers' perceptions after an initial recruitment phase of easily identified departmental content, or both. It may be that content becomes increasingly difficult to recruit once this initial set of objects has been added to the IR.
A variety of recruitment strategies are em-
ployed or planned by respondents. The majority
have tried subject specialist advocacy, identifying
likely depositors, presentations to faculty, and of-
fering to deposit electronic materials for authors.
Implementers appear to be more aggressive with
additional strategies, such as sending electronic an-
nouncements, faculty co-recruiting, offering to dig-
itize and deposit printed material for authors, and
holding awareness-raising symposia. This pattern may indicate that implementers have responded to recruitment difficulties by trying an ever-wider range of recruitment strategies.
One recruitment strategy not mentioned above
is institutional pressure on authors to submit con-
tent to IRs. Only one implementer requires authors
to submit content to the IR. One implementer and
one planner are considering such a requirement.
Half of the implementers and two-thirds of the
planners report there is no pressure on authors to
submit content. The rest encourage, but do not re-
quire, authors to submit content.
Assessment
A small number of implementers (8, or 22%) have conducted research on why users do or do not contribute to the IR, and only five planners (28%) have decided to conduct any such research. This seems odd, since the success of an IR is highly dependent on users contributing content. One explanation for
this might be that about a third of the implement-
ers and 71% of the planners answered that they
had not yet reached the assessment phase. Because
few institutions have conducted assessment of
contributor motivation, there is likely to be limited
data regarding what factors influence users who
contribute to repositories.
While close to 70% of the implementers who
have done some form of assessment of the success
of the IR have gathered direct feedback from IR us-
ers through interviews, surveys, or focus groups,
the majority (23, or 79%) have tracked hits on IR content. This is likely because “hit” data are fairly simple to collect from server log files, while collecting and analyzing more ethnographic user data is significantly more time-consuming.
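To illustrate why hit tracking is the low-effort option, the short sketch below tallies successful requests per repository item from a web server access log. It is only an illustrative sketch, not a tool any respondent described: it assumes the common Apache/Nginx combined log format and DSpace-style item URLs containing “/handle/”, and the file name is a placeholder.

import re
from collections import Counter

# Minimal sketch: count "hits" per repository item from a web server access log.
# Assumes the common Apache/Nginx combined log format and DSpace-style item URLs
# of the form /handle/<prefix>/<id>; both are assumptions, not survey findings.

LOG_LINE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+"\s+(?P<status>\d{3})')
ITEM_PATH = re.compile(r"^/handle/\d+/\d+")

def count_item_hits(log_path: str) -> Counter:
    hits = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = LOG_LINE.search(line)
            if not match or match.group("status") != "200":
                continue  # skip malformed lines and unsuccessful requests
            item = ITEM_PATH.match(match.group("path"))
            if item:
                hits[item.group(0)] += 1
    return hits

if __name__ == "__main__":
    # "access.log" is a placeholder path for the server's log file.
    for item, total in count_item_hits("access.log").most_common(10):
        print(f"{total:6d}  {item}")

A few lines of scripting against existing log files, as above, contrasts sharply with the scheduling, transcription, and analysis required for interviews or focus groups.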
It is clear from the comments that there are many
different viewpoints on what constitutes “success”
for a repository. One respondent mentioned assessing the usability of the interface, while another cited counting full-text downloads. Clearly, there are many aspects of an IR that need to be examined to determine its success.
Current Status of IR
Because the survey respondents have repositories
at various stages of development, the numbers
of digital objects in the IRs differ significantly.
Implementers report a range of 20 objects to over
19,000. Planners report between 4 and 4,500 ob-
jects in their repositories. Interestingly, not all the
materials stored in the repositories are available to
everyone. Forty-four percent of the implementers
(16) have material within their repository that is
available only to a specific user group, while 36%
of the planners (5) intend to restrict access to parts
of their IR to specific groups.
Comments from the respondents indicate that there are different reasons for these restrictions as well as different groups to whom use is being restricted.